This page is a mirror of Tepples' nesdev forum mirror (URL TBD).
Last updated on Oct-18-2019

NES and framerate

NES and framerate
by on (#160577)
I was thinking about framerate today.

Exactly what framerate did the NES operate at?

Okay, the short answer is probably going to be 30 fps or 29.97 fps (assuming we are talking about the NTSC version) but I'm really interested in a lot of gooey details that come along with this subject.

The way frames were handled back then was a whole different world from today. Today games can drop frames but operate at the same speed, or even gain extra frames if they are not capped. But I understand they didn't have the same timing mechanisms back then. When there were too many objects on the screen, the game slowed down. (I remember that.) So, what is going on when that happens? I would guess the system is just sending the same image to the TV until it gets a new one, but I wonder if there are more details to it than that.
And what about on the flip side? What happens when a frame gets rendered with time to spare?

And how does a game handle differently between NTSC and PAL?
I recall a while back I was trying to reverse-engineer Mega Man's mechanics to try to make a platformer I was making feel more comfortable. I noticed that no object ever seemed to go faster than 16 pixels per frame. This made sense, since the game would presumably not track an object's path but just what it is colliding with on a given frame. Thus, if the player fell faster than 16 pixels per frame, the player could potentially fall right past an entire block, passing straight through it.
But how do per-frame calculations work when the system is outputting a different framerate?

And if I am mistaken in my assumptions about how a computer processes data between rendering frames, please elucidate me.
Re: NES and framerate
by on (#160578)
Marscaleb wrote:
Okay, the short answer is probably going to be 30 fps or 29.97 fps (assuming we are talking about the NTSC version)

Actually, the short answer is 60 fps. NTSC video runs at 29.97 fps, but each frame consists of 2 fields, so there are a total of 59.94 images per second. The NES doesn't output exactly that though, it actually goes a little past 60 fps.

Quote:
Today games can drop frames but operate at the same speed, or even gain extra frames if they are not capped.

Games may not always be perfectly synced to the refresh rate, but the video itself always has a constant frame rate. This was true of old analog video standards back in the day and is true of HDMI standards today.

Quote:
When there were too many objects on the screen, the game slowed down. (I remember that.)

It still happens with today's games, when there are a lot of objects, polygons and effects.

Quote:
So, what is going on when that happens?

Since the video runs at a constant rate, the video chip has to deliver new pictures at a constant rate. This means that the console only has 1/60 of a second to process everything and prepare a new image. If there's too much going on in the game, and a new frame starts before the game finishes processing the last one, it simply skips updating the video that one time, since the data isn't ready. This results in the previous image being displayed again.

Quote:
What happens when a frame gets rendered with time to spare?

Usually, nothing. It waits for the next frame to start before processing another logic frame. It should be possible to get started on the next frame, but that would be somewhat tricky because you can't overwrite the data that's already been computed before it's sent to the screen, so a more complex video update system has to be used, and it'd also require more RAM.
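
To make that concrete, here is a rough sketch in C of the main-loop/NMI pattern most NES games follow (actual games do this in 6502 assembly, and the function names here are hypothetical placeholders, not any real game's routines):

Code:
#include <stdint.h>

/* Hypothetical stand-ins for game-specific routines. */
static void update_game_logic(void)   { /* physics, collisions, enemies... */ }
static void prepare_ppu_updates(void) { /* build the tile/sprite data to upload */ }
static void push_updates_to_ppu(void) { /* copy that data into video memory */ }

static volatile uint8_t frame_ready;  /* set once a frame's data is prepared */
static volatile uint8_t nmi_count;    /* bumped at the start of every vblank */

/* The PPU fires an NMI at the start of every vertical blank, ~60 times/sec. */
void nmi_handler(void)
{
    nmi_count++;
    if (frame_ready) {
        push_updates_to_ppu();
        frame_ready = 0;
    }
    /* If the game overran its budget, frame_ready is still 0: nothing gets
       uploaded, and the TV just shows the previous image one more time. */
}

void main_loop(void)
{
    for (;;) {
        update_game_logic();
        prepare_ppu_updates();
        frame_ready = 1;

        /* Finished early? Spin until the next vblank; the PPU keeps
           scanning out the current picture regardless. */
        uint8_t start = nmi_count;
        while (nmi_count == start) { }
    }
}

Either way, the picture goes out at the same constant rate; the only thing that changes is whether the data sitting in video memory is new or a repeat of the previous frame.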

Quote:
And how does a game handle differently between NTSC and PAL?

Most commercial games just run slower on PAL. Usually only the frequencies of the musical notes are adjusted (not even the tempo).

Quote:
But how do per-frame calculations work when the system is outputting a different framerate?

Games always process one logic frame after the next, so any restrictions will be constant. NES games never try to compensate for missed frames.
Re: NES and framerate
by on (#160583)
Also, the NES displays non-interlaced 60 fps video, which people have since referred to as '240p'. On some TVs, you get black scanlines between each line, since the in-between rows are never drawn, while on other TVs the video beam is thick enough to cover up the black and you don't get this artifact.
Re: NES and framerate
by on (#160589)
On the NES, writing a game that can slow down gracefully is a little bit more difficult than writing a game that stays at 60 fps. Something as simple as a status bar usually requires a carefully timed rendering state manipulation on every frame. Modern systems are completely different in this respect: they buffer a whole screen and display it all at once. On old consoles the video signal goes immediately to the TV as it's being drawn; there is no memory of pixels to hold a screen until it's finished.

60 fps isn't about processing power, it's about maintaining a budget. Atari 2600 or NES games run at 60 because they worked to keep everything that is happening simple enough to fit that timing budget. The same applies to any modern game that runs at 60.
Re: NES and framerate
by on (#160618)
Okay, this is bugging the crap out of me.
TVs don't operate at 60 fps. This is a misconception of how interlaced signals work. They do not send two different frames, they have one frame that is divided into two parts. While you could *sort of* claim that's 60 images per second, it's actually 60 half-images. The images don't even look right by themselves; they are incomplete.
Re: NES and framerate
by on (#160619)
Marscaleb wrote:
TVs don't operate at 60 fps. This is a misconception of how interlaced signals work. They do not send two different frames, they have one frame that is divided into two parts.

I'm not an expert on TVs, but you shouldn't be talking about misconceptions when you clearly don't know much about how they work either. AFAIK, the typical CRT will do a full scan from the top left to the bottom right of the screen to draw one field, and go back to the top left to draw the second field (slightly offset from the first), so the beam does indeed scan the screen 60ish times per second.

Quote:
While you could *sort of* claim that's 60 images per second, it's actually 60 half-images. The images don't even look right by themselves; they are incomplete.

All the NES needs for each frame is a "half-image", since its resolution is half of the NTSC resolution. What these old consoles did was trick TVs that were made for displaying interlaced images into displaying progressive images. AFAIK, they somehow avoided the "slightly offset" part of interlaced video so that even and odd fields would be drawn exactly in the same place (this is what created the "scanlines" effect in some TVs, like Dwedit pointed out), effectively creating 60fps video. This confuses the hell out of VCRs and video capture cards, which insist on turning the progressive video that comes out of old video game consoles into interlaced video.
Re: NES and framerate
by on (#160620)
Marscaleb wrote:
Okay, this is bugging the crap out of me.
TVs don't operate at 60 fps. This is a misconception of how interlaced signals work. They do not send two different frames, they have one frame that is divided into two parts. While you could *sort of* claim that's 60 images per second, it's actually 60 half-images. The images don't even look right by themselves; they are incomplete.

There are sixty refreshes per second. Depending on the context, this may imply sixty or thirty frames per second.

In NTSC (interlaced, sometimes called 480i), a "frame" is a combination of two fields, or what you're calling a half-image. That's sixty fields per second, combining into thirty complete frames per second. For a progressive signal at the same rate ("240p"), though, we do not combine fields, so one field is one entire frame.

tokumaru wrote:
This confuses the hell out of VCRs and video capture cards, which insist on turning the progressive video that comes out of old video game consoles into interlaced video.

Actually, my old '90s VCR records progressive 240p video just great. It plays back in the correct resolution as well.
Re: NES and framerate
by on (#160621)
mikejmoffitt wrote:
Actually, my old '90s VCR records progressive 240p video just great. It plays back in the correct resolution as well.

I just remember my game recordings looking weird back then, not as smooth as live play. I haven't used a VCR in over 14 years, so I can't say for sure what was going on. With the knowledge I have today, I just assumed it was a progressive vs. interlaced thing, but I could be wrong.
Re: NES and framerate
by on (#160623)
Relevant Wikipedia article:

Wikipedia wrote:
In the 1970s, computers and home video game systems began using TV sets as display devices. At that point, a 480-line NTSC signal was well beyond the graphics abilities of low cost computers, so these systems used a simplified video signal that made each video field scan directly on top of the previous one, rather than each line between two lines of the previous field. This marked the return of progressive scanning not seen since the 1920s. Since each field became a complete frame on its own, modern terminology would call this 240p on NTSC sets, and 288p on PAL. While consumer devices were permitted to create such signals, broadcast regulations prohibited TV stations from transmitting video like this.
Re: NES and framerate
by on (#160624)
Or perhaps it's because VHS restricted chroma bandwidth to the equivalent of 40 pixels per scanline. That's fine for an Atari 2600 but not much else.

Speaking of Atari 2600, that system gives the programmer enough control to send the sync pulse half a scanline late, which is how standard composite video signals the TV to draw the other field. There are interlaced demos for the 2600, but the pixel aspect ratio of the resulting video mode (24:7) is so wide that I question whether it's worth it.
Re: NES and framerate
by on (#160626)
tokumaru wrote:
I'm not an expert on TVs, but you shouldn't be talking about misconceptions when you clearly don't know much about how they work either. AFAIK, the typical CRT will do a full scan from the top left to the bottom right of the screen to draw one field, and go back to the top left to draw the second field (slightly offset from the first), so the beam does indeed scan the screen 60ish times per second.


Field =/= frame.
Re: NES and framerate
by on (#160627)
Marscaleb wrote:
Okay, this is bugging the crap out of me.
TVs don't operate at 60 fps. This is a misconception of how interlaced signals work. They do not send two different frames, they have one frame that is divided into two parts. While you could *sort of* claim that's 60 images per second, it's actually 60 half-images. The images don't even look right by themselves; they are incomplete.


You're confusing fields with frames. A field, a half-image of data, is only a field if it's actually interleaved with the previous field of graphics. Fields exist at ~60 fps (59.94 for NTSC). Two fields together, in succession, can represent a full frame at 30 Hz temporal resolution to the human eye, or, if need be, show 60 Hz temporal resolution at half the vertical resolution.

But when each "field" occupies the same overlapping space as the previous one, it becomes a "frame". It no longer serves the purpose of a field. Instead of combing interlace effects, you have solid frame updates... but with a side effect often labeled as "scanlines". These are permanent gaps between the scanlines of the TV, because it only has one beam height. To minimize the field combing effects of interlaced video of TV broadcasts and such, the beam was made wide so that the lines of each field would overlap - it kinda hides it. If you view that video on a PC, for example, the combing effect looks very pronounced. The scanline option in emulators tends to be wrong because it assumes a gap as wide as the beam height. Things like 25% and 50% help get that classic look, but it's not entirely accurate.

TVs weren't the only display devices to work like this. Even some computer monitors/graphic modes used line skip to show lower resolution modes (interlaced vs progressive scan).
Re: NES and framerate
by on (#160628)
Marscaleb wrote:
Field =/= frame.

Not when we're talking about video game consoles that use fields as if they were frames. :wink:
Re: NES and framerate
by on (#160629)
Interlacing is specifically a kind of lossy compression. It works by discarding half the vertical resolution information on each field. The resultant video is not half the field rate. If it were, a so-called "weave" deinterlacer would look just fine, and it doesn't.

The resulting (NTSC) video is still 60fps and still 480 scanlines high, even though each individual field is 240 scanlines high and each frame is 1/30th of a second. To reconstruct the information discarded in the compression/interlacing step requires advanced image processing.
Re: NES and framerate
by on (#160633)
lidnariq wrote:
The resulting (NTSC) video is still 60fps

Yeah, from what I've been reading, in proper interlaced video, the 2 fields do NOT depict the same moment in time, they're actually 1/60 of a second apart, just like in video game footage. Each field was supposed to be 480 scanlines, but since there's no bandwidth for that, even lines are discarded in one field, and odd lines are discarded in the next. Video game signals differ in that they don't do the even/odd thing, all lines are drawn as if they're even (or odd?), no matter the field.
Re: NES and framerate
by on (#160634)
Standard interlaced NTSC consists of 262 and a half scanlines every "frame"; due to the extra half-scanline, even fields and odd fields are drawn slightly offset from one another, producing interlaced video at around 29.97 frames per second.

On the NES, however, the PPU outputs exactly 262 scanlines per "frame", and televisions are tolerant enough of this that they display all of the fields in exactly the same place for progressive, non-interlaced video at about 60.1 frames per second.
Re: NES and framerate
by on (#160635)
Usually 480i fields are blurred a bit vertically to minimize interline twitter when a scanline is noticeably brighter or darker than its neighbors.

[Image: interline twitter. From left to right: original, interlaced, line doubled. Top row: not blurred; bottom row: blurred.]

That's why Super Smash Bros. Melee and Brawl have the deflicker switch to enable or disable the GameCube/Wii video DAC's vertical blur feature. Turning it on blurs everything and is good for CRTs; turning it off is better for LCDs that know how to weave when needed.
Re: NES and framerate
by on (#160636)
Marscaleb wrote:
Field =/= frame.

Progressive =/= interlaced.

What do you find confusing about this? Are you just used to modern LCD TVs that only implement 480i, and incorrectly interlace a 240p signal like the NES into 30 hz frames?
Re: NES and framerate
by on (#160690)
Guys, be gentle.

If you have previously researched how video signals work before starting to research how the NES works, then you will probably only have found information about the official NTSC format. The non-standard "240p" format the NES uses isn't often mentioned alongside official NTSC descriptions.

When I first learned about the NES's 60 Hz frames years ago, I went through the same confusion Marscaleb is going through. The key point that helped me understand that 60 Hz frames really do exist was what Quietust described:

Quietust wrote:
Standard interlaced NTSC consists of 262 and a half scanlines every "frame"; due to the extra half-scanline, even fields and odd fields are drawn slightly offset from one another, producing interlaced video at around 29.97 frames per second.

On the NES, however, the PPU outputs exactly 262 scanlines per "frame", and televisions are tolerant enough of this that they display all of the fields in exactly the same place for progressive, non-interlaced video at about 60.1 frames per second.

I didn't realize that if you removed the "half scanline", TVs would still work with the resulting non-standard signal and the fields would line up.

In other words, the key point is: A standard NTSC signal does use ~60 Hz fields and ~30 Hz frames, but the NES uses a non-standard signal that causes the fields to line up so they effectively become individual frames.

Maybe a 240p wiki page would be helpful so we could refer any future questions about frame rates to a well-written and easy to understand answer.
Re: NES and framerate
by on (#160696)
I added a link to 240p video in NTSC video.
Re: NES and framerate
by on (#160699)
Bavi_H wrote:
If you have previously researched how video signals work before starting to research how the NES works, then you will probably only have found information about the official NTSC format. The non-standard "240p" format the NES uses isn't often mentioned alongside official NTSC descriptions.

It's not even just about 240p vs 480i, it's really just about CRTs offering 60 hz updates. Even game systems that correctly implemented the interlacing offset still often ran games at 60 fps. You get smoother animation if you treat each field as a frame; it does blend with the previous frame, of course, but there's still a temporal offset here that increases the potential resolution of motion. On a CRT you get a more responsive, and smoother game if you treat it as 60 fps, so it's very common to see games targeting 60 through the whole SD era. It's not just the "nonstandard" 240p systems.

Most post-CRT stuff that supports 480i is focused on playing video, not video games, which tends to ignore that temporal offset in favour of a nice static 30 fps framerate. This is a better approach for source video that really was 30 fps, and TV manufacturers think their customers care more about that kinda stuff than old video games. They're probably right in a majority rules sense, but it sucks for people into old games.
Re: NES and framerate
by on (#164828)
I was looking through this thread again and it occurs to me that I never thanked tokumaru for his helpful information.

Thank you for explaining how the NES handles its framerate.

Especially after reading it again, I think I understand.
The NES hardware, I assume, calls a frame to be drawn at an exact interval, 60 times per second. If a frame has finished all of its calculations, the system basically idles until this timer is triggered. (I would guess it gets everything ready, waits for the timer, draws the frame, and then starts the next frame's calculations. But if it draws the frame immediately after the calculations and then waits to send the data, then the end result is still the same. The game still doesn't start calculating how objects have moved and stuff until an internal timer tells it to.)
If a frame hasn't finished its calculations, said calculations get interrupted, the frame gets drawn (basically just repeating the last frame), and then the calculations resume, and then (ideally) get finished with plenty of time until the next timer is called, sitting idle until it does.
So when the NES slows down, the game actually slows to exactly half of its normal speed. (Assuming we're talking about a normal slow-down situation where a finished game just slightly exceeded its expected load.)

Do I have that right?

Also, just curious. You said that the system operates slightly faster than 60 frames per second. So... Exactly how often does a game get called to run its calculations? And, how does that not cause problems with the framerate?
Re: NES and framerate
by on (#164829)
rainwarrior wrote:
Most post-CRT stuff that supports 480i is focused on playing video, not video games, which tends to ignore that temporal offset in favour of a nice static 30 fps framerate.

Anything shown live or shot on videotape incorporates that temporal offset. This includes news, sports, and old soap operas. Videotaped soap operas, in fact, created an association between high motion and low production values that made the 48 fps film The Hobbit: An Unexpected Journey look "cheaper" than the majority of live-action films, which are shot at 24 fps.

Marscaleb wrote:
If a frame hasn't finished its calculations, said calculations get interrupted, the frame gets drawn (basically just repeating the last frame), and then the calculations resume, and then (ideally) get finished with plenty of time until the next timer is called, sitting idle until it does.
So when the NES slows down, the game actually slows to exactly half of its normal speed. (Assuming we're talking about a normal slow-down situation where a finished game just slightly exceeded its expected load.)

Do I have that right?

Yes.

Regarding "slightly faster": The NES and Super NES PPU operate at 60.099 fps, which is slightly faster than the NTSC standard's 59.940 Hz field rate. This is for two reasons: the field is 262 lines long (compared to standard 262.5), and the scanline is 1364 master clock cycles long (compared to standard 1365). It's within tolerance of the old CRT SDTVs, but it's enough to mess up some newer TVs, upscalers, and video capture devices.
Re: NES and framerate
by on (#164832)
Why do modern TV sets act as though the interlaced fields come in distinct pairs, when almost every video source (including modern game systems for crying out loud) treat fields like individual frames?
Re: NES and framerate
by on (#164833)
Okay, one more thing is bugging me as I try to crunch some numbers.

A while back, I had gone frame-by-frame on some footage from some Mega Man NES games to try to reverse-engineer exactly how the player moves, so I could use that to build a game that felt more fun in its control.
What I found was that Mega Man fell one pixel faster every frame. (On the first frame, he fell one pixel, on the second, two, on the third, three, etc.) But this was under the assumption that the game was operating at 30 frames per second. If the game is actually operating at 60 frames per second, then this formula just doesn't quite work. Even if I try to calculate it as half a pixel every frame and then round things a bit, he would still be falling a small amount every 1/60 of a second. But I wasn't seeing that happen in the footage. I'm not sure where that footage is now, so I can't verify, but I would have totally noticed if the fields were showing fuzzy movement between them; I would not have even been able to count how far he was falling.

After running some math, the best way I can figure to calculate the velocity at 60 frames is to add one to the velocity each frame, but when the velocity is applied, reduce the value to one-fourth, truncating the value. This changes the applied velocity to increase by one every four frames, but produces the same sum total of movement over any length of time. But even so, it feels like a bit more of a convoluted method than what I had been figuring. Does this really seem like a reasonable method?
The only other option I can think of is that the game might intentionally slow down the system to operate at 30 frames per second so it could run more calculations between frames. Not exactly inconceivable, but I don't know enough about the system's inner workings to be able to say that it is reasonable.
Re: NES and framerate
by on (#164836)
The use of fixed-point (fractional) math for subpixel position and velocity in videogames is a well-documented thing, producing the results you see. It's more or less required for reasonable handling.
Re: NES and framerate
by on (#164838)
psycopathicteen wrote:
Why do modern TV sets act as though the interlaced fields come in distinct pairs, when almost every video source (including modern game systems for crying out loud) treat fields like individual frames?

Because Hollywood. Movies and scripted TV are shot on film or digital film at 24 fps and displayed on NTSC TV with 2:3 pulldown: one frame in 2 fields, one frame in 3 fields, etc.
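
As a small illustration of what that cadence looks like (just a toy sketch, not any particular decoder), four film frames A-D get spread across ten fields like this:

Code:
#include <stdio.h>

int main(void)
{
    const char *film = "ABCD";                    /* four 24 fps film frames */
    const int fields_per_frame[] = {2, 3, 2, 3};  /* the 2:3 cadence */
    int field = 0;

    for (int i = 0; i < 4; i++)
        for (int j = 0; j < fields_per_frame[i]; j++, field++)
            printf("%c%c ", film[i], (field % 2) ? 'b' : 't');  /* top/bottom field */

    /* Prints: At Ab Bt Bb Bt Cb Ct Db Dt Db
       i.e. 4 film frames -> 10 fields, so 24 fps becomes 60 fields/sec. */
    printf("\n");
    return 0;
}

A TV or deinterlacer that recognizes this cadence can reassemble the original 24 fps frames, which is why treating fields as inseparable pairs makes sense for film sources.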
Re: NES and framerate
by on (#164840)
tepples wrote:
psycopathicteen wrote:
Why do modern TV sets act as though the interlaced fields come in distinct pairs, when almost every video source (including modern game systems for crying out loud) treat fields like individual frames?

Because Hollywood. Movies and scripted TV are shot on film or digital film at 24 fps and displayed on NTSC TV with 2:3 pulldown: one frame in 2 fields, one frame in 3 fields, etc.


Yes, even DVD movies rely on the fact that TVs are not supposed to interpret interlaced signals as 30 fps.
Re: NES and framerate
by on (#164842)
Marscaleb wrote:
Thank you for explaining how the NES handles its framerate.

No problem. I'm usually not very good at explaining things, so I'm glad that the information I posted was useful to you.

Your understanding is mostly correct, but the way you wrote it down makes it seem like the system is more involved than it actually is, while in fact this is purely a software thing.

Quote:
The NES hardware, I assume, calls a frame to be drawn at an exact time interval of 60 per second.

Yes, it interrupts the program 60 times per second, always when the vertical blank (the period during which the video memory can be freely accessed) starts.

Quote:
If a frame has finished all of its calculations, the system basically idles until this timer is triggered.

Here, for example, it's not the system that idles, it's the software. The system is still doing its thing as usual: the PPU is still rendering pixels, and the CPU is still reading instructions from memory and executing them, even if these instructions are just part of an idle loop instead of actual game stuff. But that's something the software decided to do; the system has no clue.

Quote:
If a frame hasn't finished its calculations, said calculations get interrupted, the frame gets drawn (basically just repeating the last frame), and then the calculations resume, and then (ideally) get finished with plenty of time until the next timer is called, sitting idle until it does.

Pretty much. My only nitpick here is that the frame doesn't get drawn before the calculations can resume, those things happen concurrently. At the same time that the PPU is drawing the same frame it already drew once (since nothing in the video memory changed), the CPU resumes the game calculations. The CPU and the PPU run concurrently.

Quote:
So when the NES slows down, the game actually slows to exactly half of its normal speed. (Assuming we're talking about a normal slow-down situation where a finished game just slightly exceeded its expected load.)

Yes. However, I imagine that if a game is barely going over the processing budget, it could oscillate between finishing a frame in time and not, in which case you wouldn't get an even 30 frames per second.

Quote:
You said that the system operates slightly faster than 60 frames per second. So... Exactly how often does a game get called to run its calculations? And, how does that not cause problems with the framerate?

Like tepples said, the difference is minimal, definitely not noticeable to humans. Analog hardware usually adapts to the difference seamlessly, but with digital stuff there can be some undesirable side effects, like dropped frames or desynced audio.

Quote:
After running some math, the best way I can figure to calculate the velocity at 60 frames is to add one to the velocity each frame, but when the velocity is applied, reduce the value by one-fourth, truncating the value. This changes the applied velocity to increase by one every four frames, but produces the same sum total of movement over any length of time.

That's called sub-pixel movement. Games usually have more sub-pixel precision than 1/4 of a pixel, which would require only 2 bits. I believe the most common approach is to dedicate an entire byte to fractional positions and speeds (giving you a precision of 1/256 pixels), because that's easier/faster to work with. That way, to move half a pixel, you add 128. To move 1/4 of a pixel, add 64. Whenever the fractional part overflows (i.e. goes over 255) or underflows (i.e. goes below 0), this propagates onto the integer part of the coordinates, and only then you'll see an actual change on the sprite's position. The sub-pixel part is only internal to the engine, because the hardware can't handle changes smaller than 1 pixel.
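
Here's a small C sketch of that 8.8 fixed-point scheme (the struct and field names are made up; real games keep these as separate byte tables and do the carry with two 8-bit adds):

Code:
#include <stdint.h>

typedef struct {
    uint8_t y_px;    /* whole-pixel position -- the only part the PPU ever sees */
    uint8_t y_sub;   /* fractional position, in 1/256ths of a pixel */
    int16_t y_vel;   /* velocity in 1/256ths of a pixel per frame */
} object_t;

/* Run once per logic frame. */
void apply_velocity(object_t *o)
{
    /* Treat pixel.subpixel as one 16-bit number, add the velocity, split it
       back up. A carry out of the fractional byte moves the sprite by a whole
       pixel; anything smaller stays invisible, internal to the engine. */
    uint16_t pos = ((uint16_t)o->y_px << 8) | o->y_sub;
    pos += (uint16_t)o->y_vel;
    o->y_sub = (uint8_t)(pos & 0xFF);
    o->y_px  = (uint8_t)(pos >> 8);
}

void apply_gravity(object_t *o)
{
    o->y_vel += 0x40;   /* +64/256 = 0.25 pixels per frame, per frame */
}

So an object with a constant velocity of 0x0040 only moves one on-screen pixel every four frames.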

Quote:
The only other option I can think of is that game might intentionally slow down the system to operate at 30 frames per second so it could run more calculations between frames. Not exactly inconceivable, but I don't know enough about the system's inner-workings to be able to say that it is reasonable.

Yes, this is a valid option, but very frowned upon.
Re: NES and framerate
by on (#164847)
tokumaru wrote:
Quote:
The only other option I can think of is that game might intentionally slow down the system to operate at 30 frames per second so it could run more calculations between frames.

Yes, this is a valid option, but very frowned upon.

An option that almost every PlayStation 1 and Nintendo 64 game and many games for later systems used. NES games ported by Micronics were even slower.

Or the game could run different parts of the simulation at different frame rates. Super Mario Bros. runs a lot of tasks on a 21-frame timer (just under 3 Hz). Thwaite and RHDE run a bunch of things on a 6-frame (10 fps) or 12-frame (5 fps) cycle. For example, both games move the units once every 12 frames (5 fps), which is slightly jerky but acceptable for characters that are so small.

I remember reading somewhere that Mega Man does a lot of things on an even-odd basis: process even-numbered enemies on even frames and odd-numbered enemies on odd frames. Thwaite does the same with even- and odd-numbered explosions.
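
A sketch of that kind of staggering in C (the counts and routine names are just illustrative):

Code:
#include <stdint.h>

#define NUM_ENEMIES 8

static void move_enemy(int i)           { (void)i; /* per-enemy logic */ }
static void run_slow_housekeeping(void) { /* e.g. unit movement, AI decisions */ }

static uint8_t frame_count;

/* Called once per 60 Hz logic frame. */
void staggered_update(void)
{
    frame_count++;

    /* Even/odd split: enemy i only runs on frames of matching parity, so
       each enemy updates at 30 Hz but the per-frame CPU load is halved. */
    for (int i = 0; i < NUM_ENEMIES; i++)
        if ((i & 1) == (frame_count & 1))
            move_enemy(i);

    /* Cheaper still: some tasks only need to run every 12th frame (5 Hz). */
    if (frame_count % 12 == 0)
        run_slow_housekeeping();
}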
Re: NES and framerate
by on (#164849)
Ooh, are we talking about framerates of games now?

Wizards and Warriors III ran at 30FPS, but interpolated the player's motion to make it 60FPS.
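
I don't know exactly how that game does it, but one way such interpolation could work is to run the game logic every other frame and, on the in-between display frame, draw the player halfway between its previous and current positions. A rough sketch (all names hypothetical):

Code:
#include <stdint.h>

typedef struct {
    uint8_t prev_x, prev_y;   /* position from the previous 30 Hz logic step */
    uint8_t x, y;             /* position from the latest 30 Hz logic step */
} player_t;

/* Called every 60 Hz display frame. `fresh` is nonzero on the first display
   frame after a 30 Hz logic step; the other frame reuses the same positions. */
void set_player_sprite(const player_t *p, int fresh,
                       uint8_t *oam_x, uint8_t *oam_y)
{
    if (fresh) {
        /* first frame after a logic step: show the halfway point */
        *oam_x = (uint8_t)((p->prev_x + p->x) / 2);
        *oam_y = (uint8_t)((p->prev_y + p->y) / 2);
    } else {
        /* second frame: show the position the 30 Hz logic actually computed */
        *oam_x = p->x;
        *oam_y = p->y;
    }
}

The logic still only runs 30 times per second; only the drawn sprite position changes every frame.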
Re: NES and framerate
by on (#164850)
Solstice runs at 30 fps, double buffering its CHR-RAM rendering. When there are multiple enemies moving on screen it staggers which ones get updated, so they generally end up moving at only 15 fps.

I dunno how much 30 fps gaming is "frowned upon", really. There's certainly a crowd that appreciates 60 fps gaming (myself included), but I think the majority of PS3 / 360 games were aiming for 30 fps. For an Atari 2600 game, since you had to spend most of every frame doing raster effects, and you had very little RAM for a double buffer, 60 fps was a sensible default. For NES, I think 30 fps could be a very reasonable choice, but it's extremely uncommon. I dunno if this was really due to attitudes about framerate, or maybe it just wasn't an idea that many developers thought of. Nowadays we have a vast library of 30 fps games to demonstrate its viability, but you didn't at that time. It might just have been "outside the box" thinking for the NES era.
Re: NES and framerate
by on (#164944)
I was not aware that fixed-point was a thing.

So the games can store things in fractions of 1/256 very easily. Good to know. And logically, because of how the screen is drawn, the character will not move until they have accumulated a full pixel of movement. I suppose this applies to collision as well as rendering. Hmmm, I almost had an easy out for the collision issues I am solving in game. But it's good to know that characters can *sorta* move in fractions of pixels, because it is looking like that will have to be how I am moving in my game.

So what I am seeing with Mega Man is that his velocity is increased by 64/256 every frame at 60 frames per second.
I'm going to make some changes to my code.
Re: NES and framerate
by on (#164947)
Marscaleb wrote:
So what I am seeing with Mega Man is that his velocity is increased by 64/256 every frame at 60 frames per second.

Looks like it, according to this document:

TASVideos wrote:
Every frame, 00.40 is subtracted from his Y speed (0.25 in decimal).

For some reason, "Negative Y speed means falling, positive Y speed means ascending"... I'd expect the opposite, but maybe it's because the game subtracts the vertical speed instead of adding it.
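
And as a quick sanity check on that figure (assuming zero initial velocity, and that 30 fps footage shows every other hardware frame): accelerating by 0x40 subpixels per 60 Hz frame makes the distance covered per pair of frames grow by exactly 256 subpixels, i.e. one pixel, which lines up with the "one pixel faster every frame" pattern Marscaleb counted, give or take a pixel of jitter from truncation of the on-screen position.

Code:
#include <stdio.h>

int main(void)
{
    int vel = 0, pos = 0, prev_pos = 0;   /* all in 1/256ths of a pixel */

    for (int pair = 1; pair <= 6; pair++) {
        for (int f = 0; f < 2; f++) {     /* two 60 Hz frames per 30 Hz sample */
            vel += 0x40;                  /* gravity: 0.25 px/frame each frame */
            pos += vel;
        }
        /* Prints 192, 448, 704, 960, 1216, 1472: each 30 Hz sample covers
           exactly 256 subpixels (one pixel) more than the previous one. */
        printf("sample %d: moved %d subpixels\n", pair, pos - prev_pos);
        prev_pos = pos;
    }
    return 0;
}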