Foreword: Recently I became really frustrated with how overpriced and featureless all of the video capture devices are. The worst part is that nobody has ever made an open source video capture device. You could argue these devices are expensive because of licensing costs (like using an HDMI connector), but that doesn't explain the price of console capture devices. So I thought: it's probably time we stop letting capture card makers rip us off and build an open source solution that anyone with soldering skills can assemble for cheap. I'll lay out my ideas using the Nintendo 3DS as an example. But first, let's compare the existing methods we have for capturing 3DS footage.
Here's a list of them(sorted from highest to lowest video quality):
1. A commercial capture card(like Loopy or Katsukity)
PROS:
+High quality video output (full resolution without compression at a constant 60fps, and even the ability to record stereoscopic 3D!)
+As this is a hardware solution it will work without any problems with any software version
CONS:
-Way overpriced for what they are ($200 for installation into your own console (plus shipping to Nippon), $500+ if you want a new premodded unit), especially considering that pretty much all the components are widely and cheaply available
-Modifying hardware voids warranty and has a small chance of bricking your system
-No DIY kits (seriously, why not sell the kit for less money so people can install it themselves? Plenty of people enjoy soldering)
2. Software streaming over WiFi(e.g. NTR or HorizonM)
PROS:
+Easy to set up and doesn't require any hardware modifications
+Works wirelessly, eliminating cable mess(though that's barely a pro)
CONS:
-Video quality is ABOMINABLE. JPEG sucks, and that's not something we can change: the 3DS's 802.11g-class WiFi chip (54Mbit/s max) makes it impossible to transfer anything of higher quality (for reference: transferring the 3DS's video as uncompressed RGB888 would take roughly ≅275Mbit/s, and that's without considering packet overhead and other required data)
-Even on the lowest setting, holding my N3DSXL right up against the antennas of my PCI-E WiFi card, I still can't get anywhere close to 60fps (though at that point about 20% of the incoming data is the actual image and the other 80%+ is just noisy junk)
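As a sanity check on those bandwidth numbers, here's the back-of-envelope math (screen resolutions are the publicly documented ones; the ≅275Mbit/s figure presumably treats both screens as 400x240, but the conclusion is the same either way):

```python
# Raw bandwidth needed to stream uncompressed RGB888 at 60fps,
# vs. what the 3DS's WiFi link can theoretically carry.
TOP = 400 * 240       # top screen pixels (doubles to 800x240 in 3D mode)
BOTTOM = 320 * 240    # bottom screen pixels
BPP = 24              # RGB888: 8 bits per channel
FPS = 60

mbits = (TOP + BOTTOM) * BPP * FPS / 1e6
print(f"both screens, 2D, uncompressed: {mbits:.0f} Mbit/s")  # ~249
print("802.11g theoretical max:          54 Mbit/s")
```

However you slice it, raw video needs around 5x the link's theoretical maximum (and real-world WiFi throughput is well below 54Mbit/s), so heavy lossy compression like JPEG is unavoidable.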
3. Just put a camera in front of the screen
(do I really need to explain how crappy this method is???)
So with that out of the way, let me tell you about my ideas on a better way for capturing 3DS footage.
Note: these are just my ideas. I can be(and probably am) wrong with some of these.
From what I've read on 3dbrew, the 3DS sends its video signal to the top screen as parallel RGB888 plus HSync, VSync and a pixel clock. Tapping into these lines isn't hard, as there are test points all over the mainboard.
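A quick estimate of how fast that parallel interface actually runs (active pixels only; the real pixel clock is somewhat higher because of blanking intervals, and I haven't verified the exact figure on 3dbrew):

```python
# Minimum pixel clock for the top screen's parallel RGB888 interface.
# The panel is physically portrait (240 wide, 400 tall).
ACTIVE_W, ACTIVE_H, FPS = 240, 400, 60
mhz = ACTIVE_W * ACTIVE_H * FPS / 1e6
print(f"minimum pixel clock: {mhz:.2f} MHz")  # 5.76 MHz
```

Even doubled for blanking and 3D mode, that's single-digit MHz, which is trivial for even the cheapest FPGA.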
1. Just replicate the Katsukity board (the most realistic one)
Well, the simplest thing to do is find cheap, reliable, widely available components, do some coding, and grab the soldering iron. The parts list: an FPGA that's fast enough (or just the same chip the Katsukity boards use), a USB 2.0/3.0 controller, a PCB (or a piece of perfboard, assuming we can find the FPGA and USB controller in solder-friendly packages), and some supporting components (caps, resistors, a micro USB port). Sadly, I don't have any experience programming FPGAs (I've never owned or even seen one in my life), but it probably wouldn't be too hard to convert 8-bit RGB to YUV for use with the UVC protocol, or to just write our own PC-side software that decodes raw RGB888.
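For the RGB-to-YUV step, the standard BT.601 matrix is all the math that's needed. A Python sketch of it (in a real design this would be a few fixed-point multiply-adds inside the FPGA, feeding a YUY2/UVC stream; the coefficients are the usual full-range BT.601 ones):

```python
def rgb888_to_yuv(r, g, b):
    """Full-range BT.601 RGB -> YCbCr, the pixel format family
    commonly expected by UVC webcam drivers (as YUY2)."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.169 * r - 0.331 * g + 0.500 * b + 128
    cr =  0.500 * r - 0.419 * g - 0.081 * b + 128
    return round(y), round(cb), round(cr)

print(rgb888_to_yuv(255, 255, 255))  # white -> (255, 128, 128)
print(rgb888_to_yuv(0, 0, 0))        # black -> (0, 128, 128)
```

The hard part isn't the math, it's the USB plumbing, which is exactly why an off-the-shelf USB controller IC makes sense.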
2. VGA method (I'm not sure about this one)
Note: this is not really a capture method. It would just allow us to play 3DS games on a VGA monitor.
So, at one point I thought: why can't I just hang an 8-bit DAC off each color's 8 data lines and shove those three analog signals, ground, and the already existing HSync and VSync into a VGA plug to get an analog RGB signal out of the system? That could then be captured with a VGA capture device (but those are also $200 a piece, so you're better off just sending your 3DS to Japan to get it hardmodded)
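For the DAC itself, the cheapest option would be a resistor ladder per channel. A sketch of the ladder math, assuming 3.3V logic levels (an assumption, I haven't checked the 3DS's actual I/O voltage; also, a real VGA input is 75-ohm terminated and expects 0-0.7V, so the output would need scaling or buffering):

```python
# Ideal 8-bit R-2R ladder output per color channel into a
# high-impedance load, assuming 3.3 V logic (an assumption!).
VDD = 3.3

def r2r_output(code):
    """code: 0-255 value present on the channel's 8 data lines."""
    return VDD * code / 256

for code in (0, 128, 255):
    print(f"{code:3d} -> {r2r_output(code):.3f} V")
```

The bigger problem is timing: the top screen scans as 240x400 portrait at ~60Hz, which likely falls outside anything a VGA monitor will sync to, so some buffering/rescan logic would probably be needed anyway.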
3. Draw call tracer (most unlikely)
Another crazy idea of mine is a homebrew that logs PICA200 draw calls and either sends them over WiFi (unlikely, see above for reasons) or saves them to the SD card. We could then replay these recordings (possibly at a higher internal resolution!) with a program based on Citra's video core. The Citra team actually made something similar, called CiTrace, though it only records draw calls from the emulator, and as of any recent (and even quite old) commit Citra can no longer play CiTrace recordings and just crashes outright.
In conclusion: I hope at least a few of my ideas made some sense. I'm interested in hearing where my ideas are wrong (or how much I suck at English), or what your idea for a better capture method on the Nintendo 3DS would be. (And I also hope this is the right section of the forum to post this in)