How I Cheesed the VOLLEYBALL Challenge in Super Mario Odyssey

After cheesing the Jump Rope challenge in
Super Mario Odyssey with a Python script and a microcontroller and finding myself at the
top of the leaderboard for a few weeks, the next logical step was to see if the Volleyball
Challenge could also be cheesed. Volleyball is a lot more complex than jump
rope, with a lot more variables involved, so this was obviously going to be a much more
difficult challenge. First, though, let’s take a look at some
other methods for cheesing the Volleyball challenge. If you’re just playing the game casually
and all you care about is clearing the challenge, the best pro tip for that is, as I’m sure
everyone’s seen, to play the game in two player mode, and exclusively use Cappy, while
Mario just stands off to the side. The problem with using Mario for the Volleyball
challenge is that his acceleration curve is so slow, it’s really hard to get him to
the ball in time. In FEZ PEZ’s video on the Talkatoo glitch,
he proposed that you could try to freeze Mario in place above the volleyball snail and rally
the ball back that way. This strat works for a little while, but
unfortunately, Mario is just one little guy and the ball has such variance in its angle
and such that he simply can’t cover the entire space above the net. So far as I know, no one has successfully
gotten a high score with this method. I also found this video on Twitter of someone
using some advanced Cappy throws to theoretically cheese the Volleyball. And this technique seems to have some promise,
but unfortunately if we wanted to automate the process, short of building a robot, we
don’t have any way that I know of to programmatically emulate motion controls on the Switch. So then, the plan is to use a similar technique
to the Jump Rope bot—use the Switch Fightstick library and a Teensy 2.0++ to emulate a Switch
controller, and have that microcontroller communicate with a Python script which is
using the video feed from the game to determine what button input to send back to the microcontroller.
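
To give a rough idea of how those pieces fit together (this isn't the actual Volleybot code; the device paths, baud rate, and the decide() stub are just placeholders), the main loop amounts to: grab a frame from the capture card, decide on an input, and write a byte to the Teensy over serial.

    import cv2
    import serial

    # Rough sketch of the overall loop. Device paths, baud rate, and the
    # decide() logic are placeholders, not the real Volleybot code.
    cap = cv2.VideoCapture(0)                  # capture card feed
    teensy = serial.Serial("COM3", 9600)       # USB-to-UART adapter to the Teensy

    def decide(frame):
        """Turn a video frame into a single controller byte (stub)."""
        return 0x00                            # neutral input

    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        teensy.write(bytes([decide(frame)]))
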
For Jump Rope, this was pretty easy—all we really cared about was the timing, so my solution was to look at the score display
as a way to synchronize the timing, and all we really had to do was send a signal to press
the jump button with the appropriate timing. Volleyball, on the other hand, is a much more
complex beast. You’ve got to somehow track the position
of the player and the ball, then move the player into position to hit the ball, and
do it all in real time without ever screwing up for hours on end. So, let’s tackle each issue one by one. First off, just like when playing casually,
Mario moves too slowly to get to the ball in time, so our bot is going to be controlling
Cappy. So then, the two things we have to track are
Cappy and the ball. Like with the Jump Rope Bot, we’re going
to be using the OpenCV library in Python for image processing. There are lots of different ways to track
objects in OpenCV. You could use dedicated motion trackers or
some methods for finding circles in an image, but pretty early on, even before I made the
Jump Rope bot, I figured out a fast, simple, and reliable method for tracking these objects. First, let’s crop the video, then split
it into its red, blue, and green components. Notice anything? Cappy is by far the least red part of the
image—the sand and the ball are both red, so when we look at the red channel, Cappy
sticks out as a dark spot. That Luigi costume’s finally paying off,
huh? Similarly, on the green channel, the ball
sticks out as a dark spot. So, to track them, we just look for the darkest
spots in the red and green channels. What if the ball isn’t in the court, you
might ask? In that case the darkest part in the green
channel ends up being Cappy’s shadow, which isn’t really a problem. Also, sometimes it picks up the ball’s shadow
instead of the ball itself, but that’s also not really a problem, because so long as Cappy
follows one or the other, they'll converge when the ball reaches the ground.
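
In OpenCV terms, that whole trick is just a crop, a channel split, and a call to minMaxLoc on each blurred channel. The crop numbers below are made up, but the idea is the important part:

    import cv2

    def track(frame):
        """Find Cappy and the ball as the darkest points of the red and green
        channels of the cropped court. Crop values are placeholders, and the
        returned positions are relative to the crop."""
        court = frame[100:620, 200:1080]            # crop to the court (made-up numbers)
        blue, green, red = cv2.split(court)         # OpenCV frames are BGR
        red = cv2.GaussianBlur(red, (11, 11), 0)    # smooth out single-pixel noise
        green = cv2.GaussianBlur(green, (11, 11), 0)
        _, _, cappy_pos, _ = cv2.minMaxLoc(red)     # darkest spot in red = Cappy
        _, _, ball_pos, _ = cv2.minMaxLoc(green)    # darkest spot in green = ball (or its shadow)
        return cappy_pos, ball_pos
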
Now, for communicating between the script and the microcontroller—I pretty much glossed over that detail in the Jump Rope video, but
we can go into a bit more detail now. Basically, I use a USB to UART adapter to
send data from the computer to the Teensy. Mario Odyssey has such simple controls, I
was able to fit everything you need in just one byte. And now, we can play Mario Odyssey with a keyboard, or, more importantly, with a Python script.
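
I won't claim this is the exact byte layout, but as an illustration of how little Odyssey actually needs, you can pack a coarse stick direction and a couple of buttons into eight bits and push them over pyserial:

    import serial

    # Purely illustrative layout, not necessarily the one Volleybot uses:
    # bits 0-3: stick direction (0-7, with 8 meaning neutral),
    # bit 4: jump, bit 5: throw Cappy.
    def encode(direction=8, jump=False, throw=False):
        return (direction & 0x0F) | (jump << 4) | (throw << 5)

    teensy = serial.Serial("COM3", 9600)
    teensy.write(bytes([encode(direction=2, throw=True)]))   # e.g. move right and throw
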
So theoretically, putting these two pieces
together should be pretty straightforward. Just track Cappy, track the ball, and make
Cappy move in the ball's general direction. Hopefully it should hit it, right? Well, if we try that… okay, okay, looks
like it works pretty well. It’s getting a half-decent score. And… wait, Cappy, where are you going? Cappy, come back! Okay… looks like there’s a bit more tweaking
to be done. Although Cappy is much more responsive than
Mario, he’s also a lot bouncier. Every time he hits the ball, he gets sent
flying off in some direction. If he gets sent off the court, then, well,
he’s lost forever. In fact, I even cheated a bit in that previous
video, manually steering Cappy back into the court whenever he tried to leave. So, to try to correct for that, we can extend
the bounds of the area we look for Cappy in, and add some extra rules that aggressively
guide Cappy back onto the court whenever he strays.
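
The rules themselves aren't anything fancy; the general shape (court bounds and margin made up) is: if Cappy's tracked position gets too close to the edge of the search area, ignore the ball and steer him back toward the middle.

    # Hypothetical out-of-bounds guard; the court bounds and margin are placeholders.
    COURT_LEFT, COURT_RIGHT = 50, 830
    COURT_TOP, COURT_BOTTOM = 40, 480
    COURT_CENTER = ((COURT_LEFT + COURT_RIGHT) // 2, (COURT_TOP + COURT_BOTTOM) // 2)

    def steer_target(cappy_pos, ball_pos, margin=30):
        """Chase the ball normally, but head back to the center of the court
        whenever Cappy strays too close to the edges."""
        x, y = cappy_pos
        off_court = (x < COURT_LEFT + margin or x > COURT_RIGHT - margin or
                     y < COURT_TOP + margin or y > COURT_BOTTOM - margin)
        return COURT_CENTER if off_court else ball_pos
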
These changes help, but there's still a problem. Cappy sort of flies around all over the place,
without really looking like he’s heading directly towards the ball. You can see how he likes to sort of orbit
around Mario. Sooo, to try to combat that, I finally went
and added full analogue control, so Cappy should theoretically head, more or less, directly
towards the ball.
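
With analogue control, steering is just the vector from Cappy to the target, scaled into stick coordinates. A minimal sketch (the 0-255 range with 128 as neutral matches what Switch-Fightstick-style firmware generally expects, but treat the exact numbers as an assumption):

    import math

    def stick_towards(cappy_pos, target_pos):
        """Convert the Cappy-to-target vector into analogue stick values,
        0-255 per axis with 128 as neutral (an assumption about the firmware)."""
        dx = target_pos[0] - cappy_pos[0]
        dy = target_pos[1] - cappy_pos[1]
        dist = math.hypot(dx, dy)
        if dist < 5:                             # close enough: stay put
            return 128, 128
        return int(128 + 127 * dx / dist), int(128 + 127 * dy / dist)
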
There's just one more problem, though. Sometimes, Cappy will find himself on one side of the court, while the ball goes towards the other. The natural instinct for a human player is
to return to the center of the court whenever they successfully hit the ball to avoid this
problem. So how do we get Volleybot to know when it
hits the ball? For that, we can turn to the Jump Rope bot! The score display changes when and only when
Cappy connects with the ball, so we watch for that to change, and when it does, we reset
Cappy to the center of the court. And that almost works.
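
The score check itself is just a matter of watching a small region of the frame and comparing it to the previous frame. A sketch, with the region coordinates and threshold made up:

    import cv2
    import numpy as np

    SCORE_ROI = (40, 40, 200, 90)    # x1, y1, x2, y2 of the score display (made-up numbers)

    def score_changed(prev_frame, frame, threshold=500):
        """Return True when the binarized score display differs between two frames."""
        x1, y1, x2, y2 = SCORE_ROI

        def binarize(f):
            gray = cv2.cvtColor(f[y1:y2, x1:x2], cv2.COLOR_BGR2GRAY)
            _, bw = cv2.threshold(gray, 128, 255, cv2.THRESH_BINARY)
            return bw

        diff = cv2.absdiff(binarize(prev_frame), binarize(frame))
        return int(np.count_nonzero(diff)) > threshold
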
Buuut… there's some unavoidable (so far as I know) latency inherent in this setup, between the capture card, the script, the
microcontroller… there’s some delay between the script deciding what to do and it actually
seeing and processing the results of its own actions. For the most part this hasn’t really mattered
much, but watch this. Cappy starts in the center of the court like
we told him to, and the ball enters the court, so Cappy starts to move towards it. But then he overshoots and has to course correct. He ends up travelling farther than he actually
had to in order to reach the ball, and eventually drops it. Soooo, rather than having him reset to the
center of the court, let’s just have him reset to the left side, near where the ball
enters. Problem solved. And FINALLY, we’re ready to see the Volleybot
in action. I was so excited to show off my creation to
the world as soon as possible, for the first time ever I set up a YouTube livestream to
see how far it could go. We had about 90 people concurrently at the
peak all sitting in the chat, talking about video games and stuff and watching the number
go up. And oh boy did that number go up. Although I was worried it would break at any
moment, it seemed to hold steady well up into the thousands and my confidence grew. But then, after about four hours and over
seven thousand successful volleys— “…tried to maintain media silence on this
game before I finished it. Oh what? Oh no, what happened?” —Cappy dropped the ball! Literally. I mean, look at that, he doesn’t even try! What’s up with that? Well, upon looking at the frame-by-frame replay,
it’s pretty apparent what happened. For a few frames, he starts to head towards
the ball, but then gives up and returns to his home position. Something in the score display detection code
is misfiring in a way that didn’t affect the Jump Rope bot. If we take a closer look at the score display
itself, and specifically if we process it in the same way the script does by reducing
it to just two colors, we see that there’s actually a good deal of noise in the signal. Now, the script is tolerant of some noise,
but apparently, for whatever reason, around 7138 the noise got so bad that it just so
happened to trigger the reset behavior.
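
For what it's worth, a simple way to guard against this kind of flicker is to only accept a change in the score display once it has stuck around for several consecutive frames. That's roughly the shape of the noise correction I'll get to in a moment, though the details here are illustrative rather than the actual code:

    import cv2
    import numpy as np

    class DebouncedScoreWatcher:
        """Illustrative noise guard: only report a score change once the display
        has differed from the last accepted image for several frames in a row."""

        def __init__(self, frames_required=3, threshold=500):
            self.frames_required = frames_required
            self.threshold = threshold
            self.reference = None
            self.streak = 0

        def update(self, score_bw):
            # score_bw: the binarized score-display crop for the current frame
            if self.reference is None:
                self.reference = score_bw
                return False
            diff = np.count_nonzero(cv2.absdiff(self.reference, score_bw))
            self.streak = self.streak + 1 if diff > self.threshold else 0
            if self.streak >= self.frames_required:
                self.reference = score_bw       # accept the new score image
                self.streak = 0
                return True
            return False
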
So, even though I was perfectly happy with a score of over seven thousand—it was number 10 on the global leaderboard—knowing that
I could make it better was enough to keep me motivated. So, I spent probably more time than I should
have making a bunch of changes, some of which probably helped and some of which probably
didn’t—most importantly, I added some noise correction to the score display detector—and
I started another livestream to see how far this new and improved version could go. I also added a bunch of debugging views to
the stream, partly because of a viewer request and partly so that if it did eventually fail
we would know what happened. And—oh come on, it didn’t even get as
far? Well, at least it’s clear what happened
this time… watch closely, the script lags! You can see the video feed stutter for just
a moment, but it’s enough to screw everything up. At this point, it was midnight, I was tired,
I didn’t want to write any more code, but I still wanted that number to keep getting
higher. So, I said forget debug information, and forget
video quality standards, we're going full 2006-era webcam Let's Play, aww yeah. “There. This is a watchable stream.” I closed everything on my computer except
for Python running at REALTIME priority, set up this stupid webcam, didn’t even bother
changing any of the settings to make it watchable, then went to bed.
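
(In case you're wondering about the REALTIME part: assuming a Windows machine and administrator rights, psutil can bump the priority from inside the script itself.)

    import psutil

    # Requires Windows and administrator rights. REALTIME_PRIORITY_CLASS can starve
    # the rest of the system; HIGH_PRIORITY_CLASS is usually the safer option.
    psutil.Process().nice(psutil.REALTIME_PRIORITY_CLASS)
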
I had some pretty weird dreams, but while I was sleeping the script kept running, and when I woke up it was pretty darn successful. I finally went into the webcam settings to
try to make something watchable, buuut… it didn’t last much longer after that. With a score of 18,395, I was number four
on the leaderboard, just short of the number three slot, which is a pretty decent score
if I do say so myself. And, well, this is the end of the road for
now. With no debug information to go off of, my
patience for trying new things and then sitting around for 9 hours to see if the bot sets
a new record or not only lasts for so long. Maybe I’ll come back to Volleybot at some
point in the future, but for now, I’d say I’m happy. That being said, if anyone has any ideas for
improving Volleybot itself or any ideas of other cool projects I could work on in the
future, please let me know in the comments section. As for the code, it’s not available right
now, but maybe if I find the time I’ll clean it up a bit and release it. And, of course, if you want to see more consider
subscribing and following me on Twitter. I can’t guarantee that everything I upload
to this channel will be high quality entertainment, but if there’s one thing you should know
about me by now, it’s that I like to see the numbers go up. That’s all I’ve got for now, see you hopefully
not too far in the future.

100 thoughts on “How I Cheesed the VOLLEYBALL Challenge in Super Mario Odyssey”

  1. I've got a new video about a Twitch Plays Pokémon-style challenge run I ran called YouTube Plays Super Mario Odyssey! Check it out: https://youtu.be/QPbq4oTB_kI

  2. Beat mario odysey autonomously every control has to be already typed in code or computer calculates it. (Probably use Assist Mode). Maybe make a perfect bot beat speed runners?

  3. Im fine with figuring out ways to cheese it but when people glitch to get to the top of the boards, its really irritating. Why cant people just play the fucking game instead of doing the bullshit? People like you ruin the fun, all the fucking time.

  4. 2:11

    Two issues.
    1. That's an obvious ripoff of Pannenkoek. (And even I don't like his stuff that much.)
    2. THAT MUSIC IS AWFUL. First time using a MIDI maker?

  5. Watch the video in 1.25x speed to hear it in full speed. No idea why pimanrules slows his script down by that, but it works at 1.25!

  6. This video could’ve been 9:59 minutes long if he took out all the random pauses and sounds, and talking like a human.

  7. you could also have reseted capi when no ball was detected on the court instead of using the counter

  8. The wildest part of this, is that you recorded the whole video with that speech pattern without thinking/realizing it would be crazy annoying.

  9. I've discovered a weird, happy little affinity for videos that explain glitches and game mechanics by actually talking about the game's code. Stryder7x's videos on Paper Mario scratch the same itch for me. Maybe I should learn a little bit of Python?

  10. Not done watching the video, but when the ball moves left you could have used that as a signal that cappy should return to the middle

  11. So, I'm not really into coding, but I feel like there's a way to spoof this with some predictive code.

    I mean, the ball always originates from the same point, and moves in a straight line. So, if you wrote code that tracks the direction and velocity of the ball (probably using the shadow as a guide), Cappy could be where the ball is going before it gets there.

  12. Before looking at your picture. Because who looks at urls these days. I pronounced your name as "peeman" rules in my head.

  13. All of the comments are about the speed of his voice instead of talking about the actual video. I think after the first few popular comments we all get it.

  14. Two tips:
    1. Use HSV color space instead of RGB to isolate colors. That should provide much easier color tracking.
    2. Your latency issue can be mitigated by estimating the velocity of the ball and then extrapolating the estimated position of the ball in the next set of frames. This works just like volleyball in real life. You don't try to chase after the ball, you try to chase after where it will land.

    Edit: HSV not HSL

  15. Hey, love your Mario videos! It seems the last iteration failed because Mario got outside of the screen. My first instinct would be to instead of increasing the screen space Mario is detected, when the very dark spot from the red feed is lost, see which corner it was previously (top, top right, bottom, bottom right or right) and move him back until he is detected again.

  16. the #1 thing you can do to improve volleybot is to just simply refresh it each 5-10k or every few hours.. just pause the game. reboot your pc or whatever else that could have built up some lag tendencies, and also that way as well, you can have access to your computer during the break, and just resume it now that the script, etc, are refreshed.

  17. You can still perfectly understand him at double speed too. I do not know whether to be impressed or concerned by this discovery. Either way that was pretty cool Pimanrules nicely done

  18. 5:16 “In…that case…the darkest part of the green channel…ends up being Cappysssss…shadow….whhhiich…really isn’t a problem” 😩😴 This video could of been 5 mins long.

  19. Throwing Cappy up and then using the Homing Throw is incredibly clever. I figured the Homing Throw made it feasible but wow. So that’s how the pros get past 1000.

  20. Couldnt even finish the video with the way you talk.. you…. kindaaaaaa…. sounnnnnnddddd liiikkkeeeeee….. an… autist

  21. Make passing the edge a bad thing so it learns to make sharper quicker movement and then when it hits the ball a very good thing so it learns to try and hit the ball instead of going out of the boundary

  22. Do you speak at .75x speed in real life, or is this just intentional to fuck with the YouTube algorithm?

  23. I was thinking about an experiment I'd love to do, but I'm not very technically minded, so I'd be very grateful to get your opinion on it.

    I was wondering if you think it would be possible to create a program on your PC that could take a blueprint file of sorts and then build a level in Super Mario Maker 2 via a microcontroller. I didn't think it would be extremely difficult when I first thought of it, provided that the cursor in the editor started at the upper left corner and the initial course settings matched a particular set of defaults. My assumption would be that the microcontroller could be programmed to know how long to press the control stick in a particular direction in before moving the cursor over the next block in the grid and simply work its way from top to bottom, then moving one block width to the right and working back up, etc, placing the appropriate part in each section as it goes. The cursor would probably look kind of like a printer head moving back and forth across a piece of paper, only slower. There could be some complications when dealing with elements that can overlap though, such as semisolid platforms, coins, bumpers, vines . . . and quite a few other elements actually. So in certain cases, some items cannot be simply created on the grid position where they are intended to be placed. They have to be placed in the course in a blank location and then dragged into position. So perhaps the file containing the layout of the course would need to be more specific about things like where objects are placed initially and where they would need to be dragged to. Other elements can be resized, which requires having access to adjustment points on the element, usually a corner.

    There are definite complications to this sort of thing. But the reason I want to do it is because I would love for people to have a way to share Mario Maker courses with each other, even if they don't have Nintendo's online service. Of course, this would require people to either start out making their course outside the game itself and building a blueprint file, or having a way to convert their course into a blueprint file, which is a whole new task entirely. That's the other thing. One of my pet peeves with Mario Maker is how Nintendo will sometimes delete a course people spent tens of hours working on, and for seemingly no reason. Some people opt to rebuild the courses from scratch, but I kind of feel like they shouldn't have to. It would be so cool if a piece of software could analyze a course by its visual design and create a set of instructions for a microcontroller to follow to rebuild it. Maybe it's a bit crazy, but I'd love to figure out how to do that.

  24. I figured out how this is past 17 minutes (although this would hit 10 minutes anyway).
    He talks in slow motion in real time, with the slow music, while gameplay is at normal speed with no alter.

  25. Apparently you can emulate motion controls for switch somehow because I’ve seen YouTube plays pokemon do let’s go with a throw command

  26. Really disappointed in everyone making fun of his voice. Like you toe-sucking, basement dwelling fucking losers are any better. Grow up.
