I've been helping a friend of mine with his stereo recently and it got me energized to do some work on my own. First of all, I decided to finish a long-term project of mine that's been on hold since my daughter was born: assembling a turntable out of myriad spare parts, most of which I already had lying around.
The platter and mechanism came from a frustratingly crappy direct-drive linear-tracking turntable that I rescued from the trash. I bypassed all of the control circuitry and wired the power directly to the motor, so it only turns at 33 1/3 RPM (i.e., no 45s for me), but it seems to be pretty solid and consistent at that, at least. I'm interested in trying some different platter materials at some point, but I doubt the motor has enough torque to handle anything much heavier than the one it came with.
The tonearm comes from a Technics SL-1950. I bought this one used off of eBay for about $50. I mounted it to some spare blocks of wood I had lying around and purchased some long, skinny nuts and bolts that I used to raise it to the appropriate height for the platter and get it leveled properly. I also used the wood to sandwich some female RCA jacks that I spliced onto the tonearm's own wires so that I could hook it up to my preamp using standard RCA cables (the yellow jack is the ground line).
It sounds really good and, despite looking ghetto af, it has a nice, post-apocalyptic DIY charm that appeals to me. More importantly, though, this raw setup provides a direct line from the tonearm to the preamp/amplifier without the extra circuitry that can degrade the signal in many more user-friendly turntables (like my LP-120 before I modded it for a direct connection, as well).
If you'd like to hear the output, it's the turntable I used to make the recordings for my cartridge comparison post.
Next up, I've been giving my Dared VP-20 tube amp a rest lately and am instead using a Lepai LP-2020TI Class-D mini amplifier I purchased from Parts Express. It seems Lepai is no longer producing the original Tripath-based model, the LP-2020A+, which is a shame since it was so well-loved among audio enthusiasts, but they are making what is essentially a clone using a more easily sourced chipset (it pumps out a few extra watts, too, which is nice). It has the same clean, low-distortion sound as the original (as long as you keep the volume dial below about 11 o'clock, just like the original...) and, as much as I like my tube amp, the Lepai provides a clear accuracy that can be a refreshing change of pace from the folksy warmth of the tubes.
Finally, I overhauled my backloaded horns--which I originally fitted with some cheap but adequate drivers from MCM Electronics--with some really nice Tang-Band full-range drivers. Since the MCM drivers are only good to about 4 kHz, I had them set up on a 3-way crossover with some Bohlender Graebener Neo8 midranges. It sounded good, but I've always heard that full-range drivers running uncrossed sound more "realistic" than 2-/3-way setups.
So, I hooked the Tang-Bands up uncrossed with the ribbon tweeters wired in parallel (the Lepai can push the combined 4-ohm load just fine), and sure enough: they sound a lot brighter and more even than the crossed 3-way setup, likely due to the Tang-Band's hump above 15 kHz combined with the natural lack of sensitivity mismatching.
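For anyone checking my math on the "combined 4-ohm load": assuming both the Tang-Band full-range and the ribbon tweeter are nominally 8-ohm drivers (the individual ratings are my assumption; only the combined figure is stated above), the parallel impedance works out like so:

```shell
# Assumption: both drivers are nominally 8 ohms; wiring two loads in
# parallel combines them as R_total = 1 / (1/R1 + 1/R2).
combined=$(awk 'BEGIN { r1 = 8; r2 = 8; printf "%.1f", 1 / (1/r1 + 1/r2) }')
echo "$combined ohms"
```

Two equal loads in parallel always halve the impedance, which is why amps that are happy at 8 ohms can struggle when you start paralleling drivers.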
The Tang-Bands are supposed to be good down to 60 Hz, but they were barely usable down to probably 80-100 Hz or so (I suspect the chambers in my horn boxes just aren't large enough for the rated performance), so I definitely need my 15" subwoofer in the mix now (my old 3-way setup benefited from it, as well, but it wasn't strictly necessary). Likewise, the ribbon tweeters are supposed to be on high-pass filters for safety, but I'm pushing such a light load through them that I'm pretty sure they'll be okay.
Thursday, December 14, 2017
Phono Cartridge Comparison: AT95E vs Ortofon Omega
I recently assembled a franken-turntable from bits and pieces I salvaged from other systems, but one of the pieces I needed to complete it was a new cartridge. I already had an AT95E from Audio-Technica, which is well known and well regarded among turntable aficionados as a solid budget cartridge, so I thought I would mix it up a bit and purchase another favorite from the budget realm, the Ortofon Omega. Both of these cartridges are available for around $35-40, so I figured it would be a good, fair fight. Reviews online tend to refer to the Omega as "warmer" than the AT95E, which is known as a "sterile" but "accurate" performer.
For testing, I took two short recordings with each cartridge from the 'tape monitor' output of my preamp, which sends a full-output signal (i.e., bypassing all of the preamp's volume and EQ aside from the RIAA equalization applied by the phono stage) directly into the audio input of my Lenovo ThinkPad laptop. I used Steely Dan's "Show Biz Kids" (track 1 from side B of the first disc of Steely Dan's Greatest Hits double LP) and Dr. Dre's "Nuthin' but a 'G' Thang" (coincidentally also track 1 from side B of the first disc of The Chronic 180g vinyl remaster). I didn't do any EQ/processing/declicking, just removed the dead space at the beginning and normalized both tracks to -1 dB, since the AT95E put out a slightly louder signal and listeners tend to judge louder playback as subjectively better.
You can download my test recordings here. (Note: these recordings are short and only include the intros of the songs, and I believe them to be covered by Fair Use.) I did my testing using the Lacinato ABX audio testing software, which is free to use, listening via Audio-Technica ATH-M50 closed-back headphones.
If you wish to remain impartial, please do your listening/testing before reading any further.
Ok, so first off, the results are extremely close. I think anyone would be very happy with either of these cartridges for the prices they typically go for. In simple, unblinded A/B testing, I feel like I was able to hear a consistent difference between them, mostly in the vocal midrange. The AT95E sounds a little roomier while the Omega sounds very tight. Whether one would consider either to be better than the other probably depends on the words I use to describe them (e.g., roomy vs loose/sloppy, tight vs constricted). I honestly don't think I prefer one over the other; it's just a slightly different character.
In proper blinded ABX testing, that all fell apart and I wasn't able to reliably tell them apart. I did about a half-dozen comparisons and my best accuracy rate was around 67%, but more often I hovered around 50% and my confidence was always quite low.
So, there you have it. Both are good budget cartridges with no major differences as far as I could tell. I noticed no difference between them in "warmth" or "accuracy," whatever that means. You should probably take reviews claiming otherwise (e.g., suggesting one or the other is better for this or that kind of music, etc.) with a grain of salt. Feel free to leave a comment with your own ABX results.
Audio-Technica's AT95E
The Ortofon Omega
Labels:
audio technica,
audiophile,
cartridge,
comparison,
ortofon,
turntable
Monday, October 30, 2017
N64 VI Filter
The N64's RDP chip includes a Video Interface (VI) stage that prepares the video for the final output. From the N64 Programming Manual:
The video interface reads the data out of the framebuffer in main memory and generates the composite, S-video, and RGB signals. The video interface also performs the second pass of the antialias algorithm. The video interface works in either NTSC or PAL mode, and can display 15- or 24-bit color pixels, with or without filtering, at both high and low resolutions. The video interface can also scale up a smaller image to fill the screen.

These functions can make a very big impact on the final image of an N64 game, and the ParaLLEl-N64 libretro core exposes the ability to toggle the postprocessing effects of this stage on and off. Turning it off nets you a few frames per second of speed but also gives us a peek behind the VI curtain:

Filtered
Unfiltered

So, you can see that the filter just barely touches the HUD elements but it does some pretty dramatic stuff to the rest of the image. It applies strong antialiasing to the outside edges of objects, which has a big, noticeable effect (so noticeable, you can see it in the thumbnail images) on Mario's hat and the silhouette of the tree, and it does some blurring that smooths out the dithering that is very visible in the unfiltered shot. On actual hardware, the blurring can be toggled off in some games (Quake II, for example, IIRC) or using Gameshark codes. I believe consoles modded with UltraHDMI or etim's N64RGB boards can also switch it off through the boards' firmwares.
Wednesday, October 25, 2017
Using RetroArch via Snappy
I've been working with some folks on getting a snap package up and running for RetroArch to go with our Flatpak and AppImage universal Linux packages, and it's turned out to be more complicated than I expected to navigate the particulars of the packaging format combined with the restrictions of the security sandboxing.
We announced the package a couple of weeks ago but quickly got reports that users couldn't load files, they were confused as to why the package took a long time to load (only on the first launch, but they didn't know that), and more. Since updating my laptop to Ubuntu 17.10, I decided to "dogfood" the snap package, since that would be the only way I could get in front of the reports and ensure a good experience.
Since RetroArch requires a lot of stuff to look nice and function properly, we use a wrapper script that checks for the existence of that stuff and, if it's not where we expect it, it copies the stuff into the snap's user directory. Since that copying can take a long time, I decided to use notify-send to include some admittedly uninformative notices just to let the user know that nothing is frozen/broken and that we're just copying stuff in the background. The catch here is that you can't use the system's notify-send, you have to include it in the snap as a runtime dependency, under the "stage-packages" in the snapcraft.yaml recipe, like this. I tried adding an icon to make the notifications prettier and more obviously RetroArch-related, but I could never get it to actually see the icon, no matter where I stored it, so I gave up on that.
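For readers unfamiliar with snapcraft recipes, here's an illustrative sketch of what that stage-packages declaration looks like. The part name and plugin are hypothetical placeholders, but libnotify-bin is the Ubuntu package that actually ships notify-send:

```yaml
# Illustrative snapcraft.yaml fragment -- the part name and plugin here
# are placeholders, not RetroArch's real recipe.
parts:
  retroarch:
    plugin: autotools
    source: .
    stage-packages:
      - libnotify-bin   # bundles notify-send inside the snap
```

Anything listed under stage-packages gets unpacked into the snap at build time, so the wrapper script can call notify-send from inside the sandbox without touching the host system's copy.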
Ok, so the notifications were a nice little improvement, but we still couldn't actually get to any files to launch them, which makes the program pretty useless. For that, we needed to add the "home" plug to the recipe, like this. This plug is supposed to be auto-connected, so you shouldn't need to do anything to make it accessible to your application once it's in the recipe. However, RetroArch's non-native file browser tried to start in /, which is inaccessible (and, in fact, totally invisible) to the snap--and if your snap starts you in an inaccessible directory, you can't ever get out of it. So, I added a line to my wrapper that pre-populates the user's retroarch.cfg config file with a setting telling it to start in the home directory, where we should have access. I tried using $HOME and ~/, both of which just sent me to the snap's own home directory instead of the true home directory with all the files -_-. The solution I found--which is pretty crummy, but whatever--is to use a relative path that climbs back out of the snap's nested data directory to the real home. That is, ~/../../../
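The pre-population logic in the wrapper looks roughly like this sketch (the exact paths are my assumption; rgui_browser_directory is the RetroArch setting that controls where the file browser starts):

```shell
#!/bin/bash
# Sketch of the wrapper logic: point RetroArch's built-in file browser at
# the real home directory. Inside the sandbox, ~ resolves to
# ~/snap/retroarch/<revision>, so ~/../../../ climbs back out of it.
CFG="$HOME/.config/retroarch/retroarch.cfg"
mkdir -p "$(dirname "$CFG")"
touch "$CFG"
# only set the start directory if the user hasn't configured one already
if ! grep -q '^rgui_browser_directory' "$CFG"; then
    echo 'rgui_browser_directory = "~/../../../"' >> "$CFG"
fi
```

The grep guard matters: without it, the wrapper would stomp on the setting every launch, even after a user deliberately changed it.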
Similarly, I couldn't reach my network shares, which I mount in /media (despite adding any plug that seemed even vaguely related to such a thing to the recipe), so I had to move my mount points into my true home directory and use those same relative paths to everything, e.g. ~/../../../smbshare/Emulation/BIOS for my 'system' directory. Once the mount point is in my true home directory, I could probably put symlinks into the snap package, as well, to avoid the silly relative paths.
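The symlink idea can be sketched like this; the temp directories below just simulate the snap's ~/snap/<name>/<revision> nesting so the relative path has something to climb out of (the share name mirrors the example above, but everything here is hypothetical):

```shell
#!/bin/bash
# Simulate the snap's nested home with temp directories, then drop a
# symlink into it that points back at the real mount point.
real_home=$(mktemp -d)
snap_home="$real_home/snap/retroarch/1"
mkdir -p "$snap_home" "$real_home/smbshare/Emulation/BIOS"
# inside the actual sandbox this would simply be:
#   ln -s ~/../../../smbshare ~/smbshare
ln -s "$snap_home/../../../smbshare" "$snap_home/smbshare"
```

After that, paths like ~/smbshare/Emulation/BIOS work from inside the snap without the silly ../../../ chains.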
The last major issue I ran into was that the *.desktop launcher that shows up when you search for programs in the sidebar kept complaining about not having an "exec" line and then failing to launch because of it. This one was super-confusing because our snap has a *.desktop file (it lives in $SNAP/meta/gui), and that file definitely has an exec line. It turns out that, during installation, snapd generates the *.desktop file that the OS actually looks for and stores it in /var/lib/snapd/desktop/applications. This file is based on the *.desktop included with your program, but if the exec line isn't just like it expects, it will strip it out entirely and not give you any indication of why. Initially, our *.desktop file pointed the exec line to the retroarch.wrapper script that does so much work for us, but snapd didn't like this and rejected it until we switched it to just "Exec=retroarch", which isn't the name of the actual executable but rather the name of the snap itself. It still launched the wrapper script, since that's what our recipe points to, so we're all set.
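For reference, a minimal *.desktop file along the lines described above might look like this (the Icon path and Categories line are my assumptions; the key point is the Exec line naming the snap rather than the wrapper script):

```ini
# Illustrative $SNAP/meta/gui/retroarch.desktop -- if the Exec line isn't
# the snap's own name, snapd strips it from the generated launcher.
[Desktop Entry]
Name=RetroArch
Exec=retroarch
Icon=${SNAP}/meta/gui/retroarch.svg
Type=Application
Categories=Game;
```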
Since we need to use our script when launching from the command line, as well, we made sure it forwarded its arguments, originally with a bare $*. This has a couple of problems that experienced scripters will spot immediately: unquoted, $* word-splits, so any spaces in file names will break it, and quoting it as "$*" mashes everything into a single argument, which isn't going to work for us either. So, I changed it to "$@", which preserves each argument intact, and all is well.
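A quick demonstration of the difference, using a hypothetical count_args helper that just reports how many arguments it received:

```shell
#!/bin/bash
# count_args reports how many arguments actually arrive after each
# style of forwarding.
count_args() { echo $#; }

forward_unquoted() { count_args $*; }     # word-splits on spaces
forward_star()     { count_args "$*"; }   # joins everything into one argument
forward_at()       { count_args "$@"; }   # preserves each argument intact

forward_unquoted "Super Mario 64.z64" --verbose   # prints 4
forward_star     "Super Mario 64.z64" --verbose   # prints 1
forward_at       "Super Mario 64.z64" --verbose   # prints 2
```

Only "$@" hands the two original arguments through unchanged, which is exactly what a launcher wrapper needs.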
Now, the only issues left that I know of are: 1.) the wrapper script has our nice invader icon, which shows in the sidebar while the script is running, but once it dies off, the icon goes with it and the actual program just has an ugly question-mark/no-icon-found icon in the sidebar and 2.) the snap can't load any dynamic libraries that live outside of its domain, so I can't conveniently compile a test core and then launch from command line to test it with the -L switch. #1 isn't a big deal and #2 probably isn't possible to fix, so it is what it is.
Saturday, October 21, 2017
Running Graphical Programs as Root in Wayland
A fix for the "Invalid MIT-MAGIC-COOKIE-1 key / Cannot open display" error when trying to run graphical programs via sudo.
I just updated to Ubuntu 17.10 (and it's a great release; I was on 16.04 LTS before and it's well worth the update) and noticed that I kept getting the above error when I tried to elevate my privs to edit system files with a graphical text editor (typically gedit, but I've switched to pluma). That's a result of improved security measures in Wayland, which we didn't have to worry about under the now-abandoned Unity desktop used in previous releases.
Most of the solutions floating around for this error refer to X sessions (usually X-forwarding over SSH) and don't actually do anything to correct the Wayland issue. However, I came across this solution, which worked a treat:
Step 1.) Create a new local directory to house a custom executable:
mkdir ~/.local/bin/

Step 2.) Next, we'll make a little script that will elevate the privileges for us when invoked (the OP called it 'wsudo', which seems like a good choice to me):

nano ~/.local/bin/wsudo

Step 3.) Paste in these contents:

#!/bin/bash
#small script to enable root access to x-windows systems
xhost +SI:localuser:root
#pass all arguments through to sudo (the OP used $1, which only forwards the first one)
sudo "$@"
#disable root access after application terminates
xhost -SI:localuser:root
#print access status to allow verification that root access was removed
xhost

Step 4.) Make the script executable:

chmod +x ~/.local/bin/wsudo

Step 5.) Temporarily add our local script directory to our PATH for easy access:

export PATH=$PATH:~/.local/bin

Now, you can take it for a spin and make sure it works as expected (by running, for example, wsudo gedit /testfile and making sure it saves okay). If everything is in order...

Step 6.) Permanently add our local script directory to our PATH (single quotes keep $PATH from being expanded at write time):

echo 'export PATH=$PATH:~/.local/bin' >> ~/.bashrc

That's it. Now you should be able to invoke wsudo any time you need to run a GUI program with elevated privs.
Sunday, September 24, 2017
Padhacking a Terrible Genesis 6-button Controller
I recently got a model 1 Sega Genesis and an Everdrive MD and have been playing a lot of the great shmups and arcade ports. The standard 3-button pads are not great, period, but they're especially crummy for those sorts of games (Street Fighter is basically impossible), so I figured I'd seek out some 6-button pads.
Legit 6-button pads from Sega are quite nice, but they're getting more expensive these days (like everything retro, amirite?), so I decided to check out some of the cheap knockoffs. The cheapest ones I could find were going for $8 for 2 pads on eBay and, while I expected them to be shitty, they're worse than I ever imagined:
The buttons are so loose I was worried they would fall right out of the cheap plastic casing. The controllers themselves weigh almost nothing and their cord is a measly 3 feet long. The funniest quirk, IMO, is that they only used 4 screws to connect the housing instead of the 5 Sega used, but they put in a fake plastic screw just to keep up appearances:
Between the laughably short cord and the awful buttons, I decided to check out the PCB to see if it might be worth putting into an arcade stick (I already have a PS360+ multi-console board, which covers every console I care about except the Genesis/MD, so this would be useful). It turns out that the PCB is actually really great for this purpose, with a common-ground design and nice, big soldering pads for each input:
Here's a shot with wires soldered onto the pads:
And here's one with the solder joints smothered in hot glue for long-term stability:
I hooked it into an existing stick I had lying around and everything works perfectly. After the price of an extension cable, I'm still looking at sub-$10, so not too bad. I wouldn't recommend the Fighting Putt 6Bs for general use, but they're great for padhacking.
The Fighting Putt 6B packaging. Both pads I received looked as if they'd been sat upon.
Very clever, guys. Nobody suspects a thing.
Labels:
arcade,
arcade stick,
fighter stick,
genesis,
mega drive,
pad hack
Tuesday, August 22, 2017
8bitdo NES30 Arcade Stick Review and Modding Info
I like to play retro games from my couch and I prefer using an arcade stick, but I don't want to have a giant cord stretching across my living room as a tripping hazard. The obvious solution is to get a Bluetooth arcade stick. The only problem: nobody makes them. It seems the people driving the arcade stick market are distrustful of wireless communication due to latency concerns, despite data from a very reputable source suggesting otherwise.
8bitdo had put out a couple of arcade sticks in the past, the FC30 and FC30 Sanwa Edition, but those sticks never got much traction, AFAICT, and they're long-since discontinued now (and I've never seen one come up on eBay). They've revisited the concept recently, though (presumably because their devices are compatible with the Nintendo Switch*, which got a Street Fighter 2 port, and no other company has released an arcade stick for that market), and released their NES30 Arcade stick, which I preordered as soon as I heard about it.
First impressions - Build Quality and Information
The plastic used for the main body of the box feels a little flimsy. It has some flex to it, which isn't encouraging, and there's a lot of empty space inside the stick, though this is actually a good thing when it comes time to start poking around in there. It has a nice, thick, solid metal base with recessed screws and built-in rubber feet, which is a big advantage in my opinion when compared with the flimsy, easily lost rubber feet from the Mad Catz TE and SE sticks (and once the feet were lost, the non-recessed screws would scrape up wooden surfaces and get caught on fabric -_-).
The buttons are knockoff Japanese-style and feel predictably crummy, but passable if you're just going to use it casually. The stick feels pretty decent, really, with none of the gravelly, scraping feelings characteristic of the Mad Catz SE sticks as they slowly ate themselves.
There are 8 full-size (i.e. 30 mm) buttons for A, B, X, Y, R1, L1, R2 and L2 in modern, staggered arcade stick layout, and a smaller button (presumably 24 mm) for Start. There are also smaller non-arcade-style buttons on the control panel for Select, Pair and Turbo. While Select is bindable in gaming software, the Turbo and Pair buttons are not exposed, leaving users with 10 buttons and a 4-way joystick. That is, there is no dedicated "home" button for assigning to "menu_toggle" in RetroArch/MAME.
Wireless connectivity over Bluetooth is quick and painless, and there's no obvious perceptible latency. If you want to play wired and/or charge the stick, 8bitdo has supplied a full-size USB-A-to-A cable, which is, frankly, bizarre.
Modding
The metal base is held onto the box by 6 small phillips-head screws. Once those are removed, you can pop the base off safely. That is, there is nothing attached to the base that can get yanked out, etc. Once inside, you can see that the wiring is clean and organized, with color-coded wires leading to plastic pin-headers on the board. You can also see the support structure (the hollow tubes surrounding the buttons), which provides a strong backbone where the stick will be seeing the most abuse.
The good news: swapping out the buttons on this stick is a breeze. The stock buttons snap right out and the .110 quick-disconnects transfer over to Sanwa buttons, which are a perfect fit (I swapped in Sanwa 30 mm OBSFS buttons), with no trouble. The stick also has mounting screws that line up perfectly with a Sanwa JLF stick.
The bad news: THIS IS NOT A COMMON-GROUND PCB. That's not a big deal with the buttons (unless you just really like to daisy-chain grounds for tidiness), but it's a very big problem for the stick, since Sanwa and Seimitsu sticks use a common-ground PCB for their switches. In short, this stick is INCOMPATIBLE with Japanese-style sticks without doing some significant modification. Correction: I was totally wrong about this. It is indeed common-ground, and you can just twist up all of the grounds (looking down on the inside of the stick, it's all of the top wires from each signal+ground pair) and attach them to your stick's common-ground pin. Dunno what I was doing wrong when I first tried it, but I just now wired up a Seimitsu LS-56 with no issues. So, false alarm.
The restrictor plate/gate is held in by 4 little screws and 4 clips. Once the screws are out, the clips are nice and easy to manipulate, unlike the ones on JLF sticks, which are notoriously difficult to work with. I didn't check to see whether Sanwa plates would snap in, but it looks pretty likely. I might swap in an octo-gate at some point and will update this post if I run into any issues. [Update 10/27/2017: by request, I swapped in the Sanwa octo-gate from one of my other sticks and it fit just fine. The Sanwa plate is thicker than the stock plate, so I really had to cram it to get the clips to snap into place, but otherwise, it's no problem.] Here you can see the Cherry microswitches fit in nice and snug under the stock plate:
8bitdo had put out a couple of arcade sticks in the past, the FC30 and FC30 Sanwa Edition, but those sticks never got much traction, AFAICT, and they're long since discontinued now (and I've never seen one come up on eBay). They've revisited the concept recently, though (presumably because their devices are compatible with the Nintendo Switch*, which got a Street Fighter 2 port, and no other company has released an arcade stick for that market), and released their NES30 Arcade stick, which I preordered as soon as I heard about it.
First impressions - Build Quality and Information
The plastic used for the main body of the box feels a little flimsy. It has some flex to it, which isn't encouraging, and there's a lot of empty space inside the stick, though this is actually a good thing when it comes time to start poking around in there. It has a nice, thick, solid metal base with recessed screws and built-in rubber feet, which is a big advantage in my opinion when compared with the flimsy, easily lost rubber feet from the Mad Catz TE and SE sticks (and once the feet were lost, the non-recessed screws would scrape up wooden surfaces and get caught on fabric -_-).
The buttons are knockoff Japanese-style and feel predictably crummy, but passable if you're just going to use it casually. The stick feels pretty decent, really, with none of the gravelly, scraping feelings characteristic of the Mad Catz SE sticks as they slowly ate themselves.
There are 8 full-size (i.e. 30 mm) buttons for A, B, X, Y, R1, L1, R2 and L2 in a modern, staggered arcade-stick layout, and a smaller button (presumably 24 mm) for Start. There are also smaller non-arcade-style buttons on the control panel for Select, Pair and Turbo. While Select is bindable in gaming software, the Turbo and Pair buttons are not exposed, leaving users with 10 buttons and a 4-way joystick. That is, there is no dedicated "home" button for assigning to "menu_toggle" in RetroArch/MAME.
Wireless connectivity over Bluetooth is quick and painless, and there's no obvious perceptible latency. If you want to play wired and/or charge the stick, 8bitdo has supplied a full-size USB-A-to-A cable, which is, frankly, bizarre.
Modding
The metal base is held onto the box by 6 small Phillips-head screws. Once those are removed, you can pop the base off safely. That is, there is nothing attached to the base that can get yanked out, etc. Once inside, you can see that the wiring is clean and organized, with color-coded wires leading to plastic pin-headers on the board. You can also see the support structure (the hollow tubes surrounding the buttons), which provides a strong backbone where the stick will be seeing the most abuse.
A shot of the insides before I got started on it.
The bad news:
Speaking of the stick, it has a clip-in square restrictor plate/gate and has the control wires soldered directly to Lema microswitches, from Chinese company Zhejiang Lema Electrics Co. Ltd:
Since they were directly soldered, I needed to cut the wires, making this the first destructive modification so far.
The Lema switches are pretty close in size and shape to the tough-as-nails Cherry microswitches you would find in Happ/IL sticks and buttons, and I decided to swap them out for some I had in an old Happ Competition joystick.
The result is satisfyingly clicky and extremely light (that is, there's barely enough resistance to bring the stick back to center; some people will despise this). I was able to pull off 360/720-degree motions easily and reliably, but I'm not 100% convinced that I want to stick with this setup permanently, so I used insulated alligator clips rather than soldering .187 quick-disconnects to the wires in case I decide to swap it out with other switches in the future.
The extra-roomy case came in handy here for holding my insulated alligator clips
I didn't get around to testing it, but I suspect the longer, screw-down Happ/IL American-style buttons would fit just fine in the case, since it seems to be a little taller than the Mad Catz SE boxes, which were only about a quarter of an inch too short to fit them comfortably. [Update 10/27/2017: I was wrong, they're almost exactly the same height as the SE box, so the long-stem Happ/IL buttons won't fit, but the short-stem ILs will.]
*Note, the wired vs wireless issue seems to actually be in favor of wireless on the Switch, oddly enough: https://www.youtube.com/watch?v=avvmck40cIw
Labels:
8bitdo,
arcade,
cherry microswitch,
fighter stick,
happ,
sanwa,
street fighter
Friday, July 7, 2017
RetroArch shaders ported to ReShade
I've frequently gotten requests to port RetroArch's library of shaders to ReShade's format for use with other programs/games, but I'm a Linux guy for the most part, so I never had the inclination to do such a thing. Thankfully, ReShade user Matsilagi is/was so inclined and he ported a bunch of them and made them available in this GitHub repo:
I haven't tested all of them, but the ones I've seen look perfect, so they should be ready to go with all of the low-fi CRT/NTSC/PAL goodness.
Artifact Colors
CRT-Geom and CRT-Lottes
PAL
NTSC
Thursday, June 22, 2017
RetroArch Tone-mapping LUT Shader
ReShade has long had a shader, LUT.fx, that enables users/designers to do tonemapping and other color adjustments without touching any code. Instead, they can do all of their adjustments in an image editing program, such as Photoshop or GIMP, which many people are familiar with already. While my image-adjustment and color-mangler shaders can be used to accomplish those same tasks (Pokefan531 did a great job modifying them to produce his handheld color shaders, after all), they're awkward to work with, since an artist has to make all of his/her changes by twiddling esoteric values in the shader settings menu.
So, I decided to port the ReShade shader to RetroArch so our users could get in on the fun. I ran into some unexpected behavior with the direct port, though, and decided to adapt another similar shader instead. This one ended up having the same weird issue, which I think is related to undefined behavior, but I put a stupid workaround in the shader to mostly deal with it.
Anyway, here's how you use it:
First, take a screenshot that you want to use as your reference (I'm going to use the Sonic the Hedgehog title screen) and then take one of the passthrough palette textures that come with the shader (they're the png files located in reshade/shaders/LUT, named for their color depth). Then, in Photoshop or GIMP or Paint.NET or whatever, open your reference screenshot and paste the palette image down below it:
I find the easiest way to do this is to go to the image menu > canvas size and then anchor it from the top and increase the canvas Y-axis measurement by the height of the palette texture (in my example, I'm using the '16' texture, so I made the image 16 px taller). Paste the palette image in there, move it to the bottom-left corner and then merge the palette layer with the screenshot layer.
Next, do whatever it is you need to do to make the picture look like you want it. That includes brightness/contrast, hue/saturation, indexed color, different lossy colorspace conversions (such as by switching to CMYK colorspace and then back to RGB). I'm going to do a simple hue shift in my example because it's easy to spot:
Once you have it all set, we need to isolate the palette from the screenshot. I think the easiest way to do this is to go back to image > canvas size, anchor from the bottom left and enter the size of the palette (the width of the passthrough palettes is the height squared; I used the '16' palette, so it's 16 * 16 = 256 px). Save that image under a new name (I called mine 'hedgehog-palette.png'; if you're using a different palette size from the default 16, it's probably a good idea to put that into the filename somewhere so you don't forget it) and then drop it into your reshade/shaders/LUT directory with the other palette images.
Now, open the LUT shader preset (cgp/glslp/slangp file) in a text editor and change the line that points to the palette (probably line 7, but YMMV):
SamplerLUT = shaders/LUT/16.png
becomes
SamplerLUT = shaders/LUT/hedgehog-palette.png
Save and exit and then fire up RetroArch, load a core and some content and then load up the shader. It should apply those same color transformations to the game image, like so:
If you used a different-size palette from the default 16, your colors may look crazy and messed up at first, in which case you need to go back into the quick menu > preview shader changes and then change the "LUT Size" runtime parameter to match.
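If you're curious what the shader is actually doing with that strip, here's a rough sketch in Python. This is my own illustration, not the shader's code: the layout (red varying across each 16-wide tile, green selecting the row, blue selecting the tile) is an assumption based on the width-equals-height-squared math, and a real shader also interpolates between cells rather than snapping to the nearest one.

```python
def apply_lut(pixel, lut, size=16):
    """Map an (r, g, b) pixel (0-255) through a flattened LUT strip.

    `lut` is a row-major list of (r, g, b) tuples laid out like the
    (size*size)-wide, size-tall palette image: red varies across each
    tile, green selects the row, blue selects the tile. Nearest-neighbor
    only; a real shader would interpolate between neighboring cells.
    """
    r, g, b = (c * (size - 1) // 255 for c in pixel)
    return lut[g * size * size + (r + size * b)]

# an identity "passthrough" strip, like the stock 16.png
levels = [round(i * 255 / 15) for i in range(16)]
identity = [(levels[x % 16], levels[y], levels[x // 16])
            for y in range(16) for x in range(16 * 16)]

apply_lut((255, 0, 0), identity)  # → (255, 0, 0): passthrough leaves colors alone
```

Any edits you bake into the palette image (the hue shift above, posterization, whatever) just change which colors sit in those cells, and the lookup does the rest.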
This shader is available from the online updater and/or git in GLSL, Cg and slang shader formats.
Labels:
lut,
photoshop,
pixel shader,
RetroArch,
tonemapping
Monday, June 19, 2017
RetroArch shaders on Shovel Knight
I recently played through the awesome retro-styled platformer Shovel Knight for the first time and, while the pixel art is incredible, I kept thinking "I bet this would look even more amazing on a CRT." The game runs at a higher resolution on PC, so getting the low-res scanline-y look I crave wouldn't really be possible (or at least would be a significant hassle) through a 31 kHz PC monitor. However, thanks to j_selby's out-of-the-blue Citra-libretro core, we can now run 3DS games via RetroArch--including Shovel Knight--and apply all sorts of fun shaders to the output!
Here are a couple of shots (click to embiggen):
The old favorite, cgwg's CRT-Geom
xBR-lvl2
artifact-colors; horizontal scaling is a little wonky
crtglow-gaussian
ntsc-320px-gaussian-scanlines
Labels:
3DS,
citra,
pixel art,
pixel shader,
RetroArch,
shovel knight
Thursday, June 1, 2017
New NTSC Shaders
NTSC simulation/emulation is a tough nut to crack. There's a lot of math involved, and the results are very dependent on games/content having the correct resolution or else the effect falls apart. Themaister's NTSC shader does a fantastic job with both 256- and 320-pixel-wide content, which covers most modern-ish retro consoles, including S/NES, Genesis, PS1 and N64, and it handles content of arbitrary resolutions pretty well. Nevertheless, it's always good to have variety, so I was excited to find some other shaders that included different takes on the NTSC signal problem.
Artifact Colors
This shader is based on Flyguy's awesome "Apple II-Like Artifact Colors" shadertoy and is the most impressive/magical of the shaders I'm going to cover here. Where this shader excels is in reproducing the NTSC "artifact colors" that certain old computers depended on before full-color interfaces were a thing. You can find some great explanations of the phenomenon at the 8088 mph demo writeup and this post at nerdlypleasures.
This splitscreen image demonstrates just how mind-boggling this effect can be, taking a 1-bit black and white image and ending up with bright colors:
And here's an animated gif that cycles between the typical limited-palette RGB output and the full-color artifact-color version (half RGB and half composite artifact back and forth):
In porting this shader, I tried to make runtime parameters out of as many variables as possible because it's such a cool thing to see how they affect the emergent colors in real-time. One such parameter is the F_COL variable, which can be used to basically have another color palette that you can switch to.
I also made a preset that pairs it with T. Lottes' CRT shader, which is perfectly suited to CGA content:
I made the presets force a 640-pixel width, since that seems to be the sweet spot for this shader, which isn't surprising seeing as many of the games that relied on artifact coloring used a 640x200 mode. I'm not very familiar with any of the classic computer cores that could take advantage of this shader, but I'd love to hear comments from anyone who gives them a shot.
This preset also looks pretty darn good for general use, though it does a strange green/magenta strobe thing on Sonic's waterfalls and probably other pseudo-transparency:
CRTSim
This one from J. Kyle Pittman / PirateHearts is similar to the CRT effect in You Have To Win the Game:
It doesn't try to be accurate; it just looks nice and runs fast. While it does the whole bevy of old TV effects, including a really nice per-channel color persistence that can be used to reproduce the characteristic red smear of a failing CRT, my favorite part of it is the LUT-based NTSC effect. It uses a simple texture of diagonal colored lines:
and mixes it in with the main image based on differences in brightness between each pixel and its neighbors. For such a simple concept, the result is very convincing, and it even does a reasonable job of "passing" the NTSC crosstalk test-ROM (with certain settings):
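The mixing rule, as I understand it, can be sketched like this (my own reconstruction of the idea, not Pittman's actual code; the 0.299/0.587/0.114 luma weights and the 50/50 neighbor averaging are assumptions):

```python
def stripe_mix_amount(center, left, right, strength=1.0):
    """How much of the diagonal-stripe NTSC texture to blend in at a
    pixel: proportional to how much it differs in brightness from its
    horizontal neighbors. Flat regions get ~0 (and stay clean), while
    sharp transitions approach 1 (and pick up rainbow fringing)."""
    def luma(rgb):
        r, g, b = rgb
        return 0.299 * r + 0.587 * g + 0.114 * b
    diff = (abs(luma(center) - luma(left)) +
            abs(luma(center) - luma(right))) / 2
    return min(1.0, strength * diff)

def blend(pixel, stripe, amount):
    """Plain linear interpolation toward the stripe texture's color."""
    return tuple(p + (s - p) * amount for p, s in zip(pixel, stripe))
```

Because the stripe texture only shows through at brightness transitions, you get crosstalk-style color fringing exactly where composite video would produce it, at a fraction of the cost of a real signal simulation.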
Also of interest with this shader is the choice of permissive public domain licensing, so people can integrate it into their games and other programs without fear of licensing conflicts.
Both of these shaders are available in RetroArch's regular GLSL and slang/Vulkan formats. Artifact Colors is also available in RetroArch's Cg format and for ReShade, courtesy of user Matsilagi, who has also ported a number of other RetroArch shaders to ReShade. On-topic but not pictured are two NTSC shaders derived from MAME's NTSC shader implementation, one multipass and one single-pass. I haven't gotten them to work properly in GLSL format, only slang, and all of my Vulkan screenshots have their red and blue channels swapped for some reason right now, so I couldn't share any shots of them. You can get a taste from their shadertoy implementation, though.
RGB output on the left, composite artifact output on the right. Those colors are all generated by the signal modulation :O
Notice how the magenta and black stripes turn to a rich brown
I called this one c64-monitor in the 'presets' subdirectory of the shader repos
This looks gray, but it's actually red, green and blue diagonal lines
Wednesday, April 5, 2017
Upscaling shaders for pre-rendered backgrounds
In the PlayStation 1 / Nintendo 64 era, it was a common practice to pair low-poly 3D models with prerendered photo-realistic 2D backgrounds to give the illusion of a fully 3D-rendered game. This technique was used extensively in the Resident Evil series and Squaresoft RPGs (Final Fantasy 7, 8 and 9 and Chrono Cross), and when we emulate those games, we run into some issues related to the backgrounds.
If we use an emulator that supports increased internal resolution, the 3D polygon elements look sharp and crispy but the prerendered scenes are super-pixelated, and the contrast between the two can be jarring and distracting:
To avoid that, you can stick to the native resolution and let everything look consistently pixelated and then use a full-screen post-processing shader to remove some of the rough edges. Unfortunately, many of the algorithms we use for upscaling representational, cartoony pixel art don't always do a good job with the prerendered scenes, as the subtle gradients and natural, random dithering can trip up the pixel comparison and pattern detection methods that work so well in other applications:
There are some good upscaling algorithms that are designed to deal with this sort of application, though, and we'll take a look at some of the ones available in shader form. We'll be comparing 4x upscales from two native-res captures from Final Fantasy 7:
ScaleFX-Hybrid
Sp00kyFox's scaleFX algorithm does a stellar job on cartoony stuff, and he tweaked the algorithm a bit to better handle small details. It has a rough, stipple texture on edges, unlike the regular version, and maintains details that the regular version smooths away:
It also handles gradients better than the regular scalefx, with reduced posterization.
NNEDI3
This algorithm, which utilizes a "predictor neural network architecture and local neighborhood pre-processing" (whatever that means), was recently ported to RetroArch's slang shader format by a fellow who goes by the name ZironZ. He hardcoded the huge swaths of weights derived from the neural network into the shader passes, which makes for relatively long compile times but respectable speeds once the task is done. He provides a handful of presets, some of which are much too demanding for my modest GPUs, but the basic preset runs in real-time and looks very nice:
This algorithm compares well with the popular waifu2x algorithm, which is also neural net-based but cannot run anywhere close to real-time (these shots are using the 'photo' preset with 'medium' denoising):
While all upscalers have a "signature" in the way their output looks (characteristic swirls or burrs on objects, etc.), much of waifu2x's magic mojo comes from its denoising routine, which gives images a surreal, painterly look when maxed out:
We can't reproduce that look exactly with real-time shaders, but we can get closer by adding a pre-pass of bilateral blur, which denoises the image by blending neighboring pixels that are close in color/intensity.
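In case the mechanism isn't clear, here's a toy 1-D version of the idea (illustrative only; real implementations work in 2-D, vectorize heavily, and the sigma values here are arbitrary):

```python
import math

def bilateral_1d(row, radius=2, sigma_s=1.0, sigma_r=0.1):
    """Toy 1-D bilateral filter on intensities in [0, 1]: each neighbor
    is weighted by BOTH its spatial distance and its intensity
    difference from the center pixel, so flat regions (and dither
    noise) get smoothed while hard edges survive."""
    out = []
    for i, center in enumerate(row):
        num = den = 0.0
        for d in range(-radius, radius + 1):
            j = min(max(i + d, 0), len(row) - 1)   # clamp at borders
            w = (math.exp(-(d * d) / (2 * sigma_s ** 2)) *
                 math.exp(-((row[j] - center) ** 2) / (2 * sigma_r ** 2)))
            num += w * row[j]
            den += w
        out.append(num / den)
    return out

# the dithered flat area flattens out; the hard edge stays put
smooth = bilateral_1d([0.4, 0.5, 0.4, 0.5, 1.0, 1.0])
```

Feed it a row with dither-style wiggle next to a hard edge and the wiggle flattens while the edge survives, which is exactly the behavior that makes it a useful denoising pre-pass for these upscalers.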
Fast-bilateral-NEDI
This combines Xin Li et al's NEDI algorithm (not to be confused with the aforementioned NNEDI3 from tritical) with a fast-bilateral pre-pass to get some of that smooth, surreal look at the cost of lost detail:
The fast-bilateral shader includes a runtime parameter to control the strength of the blending, and I modified it to go all the way up to 2.0 (default value is 0.4 with a max of 1.0), which looks nice but nearly wipes out the tile mosaic:
Fast-bilateral-super-xBR
Hyllian's xBR algorithm is now legendary in emulation circles for its ability to upscale pixel art, and the super-xBR variant is tuned to handle images and photos. The pre-pass of fast-bilateral cleans up some of the images' own built-in noise:
Again, cranking up the bilateral-blur to 2.0 has an interesting if not useful effect:
And, just for fun/comparison, here's what waifu2x looks like in "artwork" mode (that is, a mode that throws away even more detail) with the denoising at maximum:
Hyllian's 3D detection shaders
An alternative strategy involves using an initial shader pass to identify which elements are upscaled 3D models vs which ones are 2D textures. This lets the shader work on HUD elements and backgrounds without getting thrown off by the increased internal resolution:
An unsettling juxtaposition.
Only a few pixels get smoothed; most are skipped
waifu2x, photo, medium denoise; no shader form available
waifu2x, photo, max denoise; no shader form available
waifu2x, artwork, max denoise; no shader available
super-2xbr-3d-6p-smoother (I think)
jinc2-sharper-3d (I think)