In reply to Dauphin:
> Standard computer screen these days is 60hz and 1080p or higher - far superior than these '60s era (?) broadcast formats. Max that shit out bro.
From what I've read it isn't that simple, or else I'm just getting confused. As far as I can gather, it comes down to the mains electricity of the two countries. America has its mains frequency (or whatever the correct terminology is) at 60 Hz, while the UK has its electricity set to 50 Hz, which is why when you capture footage you can get the lighting or TVs to flicker, or not. That's where the frame rates of NTSC and PAL come from: the 60 Hz and 50 Hz mains equate to the differing frame rates of the USA and the UK.
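For what it's worth, here's a tiny Python sketch of the arithmetic as I understand it; the 29.97 footnote is something I picked up while reading, so don't quote me on it:

```python
# Rough sketch of how (as I understand it) the old broadcast frame rates
# fall out of each country's mains frequency. Interlaced TV drew two
# "fields" per frame, one field per mains cycle, so frames = mains / 2.

MAINS_HZ = {"NTSC (USA)": 60, "PAL (UK/Europe)": 50}

for standard, mains in MAINS_HZ.items():
    fields_per_second = mains          # one interlaced field per AC cycle
    frames_per_second = mains / 2      # two fields make one full frame
    print(f"{standard}: {mains} Hz mains -> {frames_per_second:.0f} fps")

# Note: colour NTSC actually runs a touch slow, 60/1.001 fields per second
# (29.97 fps), a tweak made when colour was added, but 30 is the round number.
```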
I may be confused by all this electronics, as it's way above my pay grade, but I've witnessed the difference when setting my camera to NTSC versus PAL. But does any of this really matter when I have a monitor that refreshes at 60 Hz? From what I read about PAL, should it not be 50 Hz?
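The flicker part I can at least put rough numbers on. This little check is just my layman's take on why a frame rate that fits cleanly into the local lighting looks fine while a mismatched one flickers, so treat it as a guess:

```python
# Mains-powered lights pulse at twice the mains frequency (both half-cycles
# of the AC). If a whole number of those pulses doesn't fit into each frame,
# the exposure catches the lights at different points in their cycle and you
# get the rolling flicker I've seen when the camera setting is wrong.

def flickers(mains_hz: int, fps: int) -> bool:
    light_pulses_per_sec = 2 * mains_hz      # lights pulse on both half-cycles
    return light_pulses_per_sec % fps != 0   # uneven fit -> visible flicker

for mains in (60, 50):                       # USA vs UK mains
    for fps in (30, 25):                     # NTSC-ish vs PAL frame rates
        tag = "flickers" if flickers(mains, fps) else "clean"
        print(f"{mains} Hz mains at {fps} fps: {tag}")
```

That comes out as 30 fps clean under American lights but flickering under British ones, and the other way round for 25 fps, which matches what I saw switching the camera between the two modes.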
What confuses me more is that just about all the recording toys I have, from my phone to my GoPro to my camera, come preset to 30 fps. Is this because the American system has won through, or is this now simply the accepted standard, with PAL/25 fps defunct in the world of computers and YouTube?
I remember creating a movie for some friends at work and buggering up the capture frame rate, with the eventual project rendered at a different one. I didn't notice until all the hard work was done, and I was left with an 8-minute movie that stuttered every second because of the extra frame thrown in at rendering. How I did laugh at this error.
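Guessing at the mechanism after the fact (the 29 fps figure below is hypothetical, just picked so the sums land on one extra frame per second, and this is a sketch of the general idea rather than whatever my editing software actually does):

```python
# A guess at what caused my stuttering movie: if the project renders at a
# higher rate than the footage was captured at, the renderer has to repeat
# source frames at regular intervals, and each repeat shows as a tiny hitch.

CAPTURE_FPS = 29   # hypothetical: footage captured at one rate...
RENDER_FPS = 30    # ...but the project rendered at another

duplicated = 0
prev_src = -1
for out_frame in range(RENDER_FPS):               # one second of output
    src_frame = int(out_frame * CAPTURE_FPS / RENDER_FPS)
    if src_frame == prev_src:
        duplicated += 1                           # same source frame shown again
    prev_src = src_frame

print(f"{duplicated} duplicated frame(s) per second -> a hitch every second")
```

If the gap had been bigger, say 25 fps footage in a 30 fps render, it would be five repeats a second, which I imagine looks more like constant judder than the once-a-second stutter I got.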
The frames per second isn't really my problem; it's more the NTSC and PAL. Does it even matter any more, now that we hardly ever watch the stuff we make on a TV?