Baseball (take II) and Kinect eco/swimming [NOC Final proposal]

UPDATE II: Two final ideas. I’m going to finish one, try to finish the other.

UPDATE: I’ve come up with an idea that, hopefully, will work.

Kinect swimming/eco

My Assistive Tech project has to do with motivation. It’s about making rehab, after a stroke, more bearable. But when we demoed the project, one of the users said it was therapeutic — and that she’d love to use it for her tai chi. Now, I don’t know anything about tai chi. But I do know about relaxation — and swimming! So how about a program that encourages someone to “swim” in their living room by exploring a virtual space?

So I’m thinking of adapting the Assistive Tech project into something with a more therapeutic purpose. That project requires you to hit a certain spot to move forward. But what if you could move forward by simply wading? And what if you could swim left and right? Ah, what a concept!
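To make that concrete, here’s a minimal Python sketch of how wading motion could map to movement. In the real thing, the hand positions would come from the Kinect; the function names, gains, and units here are all made up for illustration:

```python
def stroke_speed(prev_left_y, left_y, prev_right_y, right_y, gain=2.0):
    """Forward speed from how much both hands moved vertically this frame."""
    return gain * (abs(left_y - prev_left_y) + abs(right_y - prev_right_y))

def turn_rate(left_motion, right_motion, gain=1.5):
    """Paddle harder with one arm and you veer toward the other side."""
    return gain * (left_motion - right_motion)
```

Each frame, you’d add `stroke_speed(...)` to the swimmer’s forward velocity and rotate by `turn_rate(...)`; if the hands barely move, you drift to a stop.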

Now, to get any real exercise, you may need to hold weights or something.

One more thing! I want to add some type of game element to this. Ideally, it would be an ecosystem where YOU are one of the species. And YOU have to find food. And YOU have to, uh, mate? Yeah, I don’t know. We’ll see how that works out.

Anyway, it might be interesting to see whether I have to represent the fish/person in the water — or if I could make the user guess what kind of fish they are by looking at their offspring.

OK, I’m starting to like this. We’ll see if it sticks…

Smart Bubbles (the above project swallows up this one… maybe)

For Assistive Tech, my group is creating a program in which we use the Kinect to “pop” bubbles with our hands. (Here’s a link to that blog post, updating progress.) Right now, the bubbles, which are the target, appear in random spots, or in the same spots. But what if the bubbles were smart? What if they could evolve to “survive,” so it becomes harder to hit them?

Now, we haven’t learned genetics yet, so I’m not entirely sure how I’m going to do this. But I want these bubbles to learn where they won’t survive after a certain number of “hits,” and to appear farther away. That way, the occupational therapists don’t have to manually change the bubble locations. Instead, they can simply run the program, and it will be smart about where to appear.
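Even before the genetics, the “appear farther away” behavior could start as something like this Python sketch. The class name, screen size, and scoring rule are all my own invention, not anything from our Assistive Tech code:

```python
import random

class SmartSpawner:
    """Toy 'smart bubbles': after hits near a spot, new bubbles
    spawn as far as possible from the player's recent hit positions."""

    def __init__(self, width=640, height=480, memory=10, seed=None):
        self.width, self.height = width, height
        self.hits = []          # most recent hit positions
        self.memory = memory    # how many hits to remember
        self.rng = random.Random(seed)

    def record_hit(self, x, y):
        self.hits.append((x, y))
        self.hits = self.hits[-self.memory:]

    def next_bubble(self, candidates=20):
        """Sample candidate spots; keep the one farthest from any recent hit."""
        best, best_score = None, -1.0
        for _ in range(candidates):
            x = self.rng.uniform(0, self.width)
            y = self.rng.uniform(0, self.height)
            score = min((((x - hx) ** 2 + (y - hy) ** 2) ** 0.5
                         for hx, hy in self.hits), default=0.0)
            if score > best_score:
                best, best_score = (x, y), score
        return best
```

A real version would replace the distance score with evolved “genes,” but the therapist-facing behavior is the same: run it, and the bubbles place themselves.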

However, one potential problem: We are working with stroke patients, and they need to perform consistent motions. So this might not be too helpful for them. BUT it might be helpful for healthy people, who want a stretching exercise. Still working on the idea, though…

Field Of Seems, Take 2

First, I want to redo my ICM final from last semester, using more realistic physics. I called it Field of Seems, and the idea was to use the Pitch f/x database to re-animate real-life pitches from Major League Baseball. Of course, I faked all the physics. But this time around, I’m hoping to emulate the physics a little better.

The plan is to use an initial applyForce(throw) function to exert acceleration on the ball at the point of release. That “throw” number will depend on the velocity of the pitch. Then I’ll have an applyForce(friction) function, which slows the ball down as it approaches the plate, since that’s what happens in real life because of air friction. Lastly, to take care of the break, I need an applyForce(break) function, but only applied past a certain z-distance, since it takes time for the ball to begin its break.
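Here’s roughly how I picture those three forces fitting together, as a hedged Python sketch of the Nature of Code force-accumulation pattern (a Processing version would look similar); the constants are made up, not real Pitch f/x values:

```python
class Ball:
    """A point mass: accumulate forces each frame, then integrate."""

    def __init__(self, pos, vel, mass=1.0):
        self.pos = list(pos)   # [x, y, z], z pointing from mound to plate
        self.vel = list(vel)
        self.acc = [0.0, 0.0, 0.0]
        self.mass = mass

    def apply_force(self, force):
        for i in range(3):
            self.acc[i] += force[i] / self.mass

    def update(self, dt=1.0 / 60):
        for i in range(3):
            self.vel[i] += self.acc[i] * dt
            self.pos[i] += self.vel[i] * dt
            self.acc[i] = 0.0  # clear accumulated forces each frame

def step(ball, drag=0.05, break_force=(-1.0, 0.0, 0.0), break_z=30.0):
    # "friction": air drag opposing the current velocity
    ball.apply_force([-drag * v for v in ball.vel])
    # "break": only applied once the ball is past a certain z-distance
    if ball.pos[2] > break_z:
        ball.apply_force(list(break_force))
    ball.update()
```

The “throw” is just the initial velocity handed to the Ball; after that, drag and the late-arriving break force do the rest each frame.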

The hardest part is that the Pitch f/x database simply records real-world results; it does not tell me the initial conditions. So while the database will tell me the start location and end location, I can only set the start location. The acceleration and break need to take care of the end location. This could be a huge pain in the behind, but we’ll see how it goes.
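One possible workaround, and this is my own assumption rather than anything Pitch f/x provides: if I treat the break as a roughly constant acceleration over the pitch’s flight time t, I can solve for the acceleration that connects the known start and end points, using p_end = p_start + v0*t + 0.5*a*t^2:

```python
def required_accel(p_start, p_end, v0, t):
    """Constant acceleration that carries p_start to p_end in time t,
    given initial velocity v0 (from p_end = p_start + v0*t + 0.5*a*t**2)."""
    return 2.0 * (p_end - p_start - v0 * t) / (t * t)
```

Run per axis (x for horizontal break, y for drop), and the end location takes care of itself even though the database never told me the forces.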

A Kinect Ecosystem

I want to create an ecosystem much like my NOC midterm — except I want to add three key elements.

First off, I want to add an ecosystem quality. I want it to be able to sustain life without any manipulation on my part.

Secondly, I want the creatures to have a genetic makeup, so they will evolve more realistically — and not just using color, like I previously had.

Lastly, I want to be able to explore this world via Kinect motions. I plan on using OSCeleton to get specific joint movements. But, frankly, this is the easiest part.
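For that second element, the genetic makeup, I imagine something like this even before we cover genetics in class: each creature carries a list of numbers as genes, and offspring mix and occasionally mutate them. A minimal Python sketch, with the gene encoding and mutation rate as placeholders:

```python
import random

def crossover(parent_a, parent_b, rng):
    """Uniform crossover: each gene is taken from one parent at random."""
    return [rng.choice(pair) for pair in zip(parent_a, parent_b)]

def mutate(genes, rate, rng):
    """With probability `rate` per gene, nudge it a little."""
    return [g + rng.gauss(0.0, 0.1) if rng.random() < rate else g
            for g in genes]
```

The point is that the genes could then drive size, speed, or wing shape, not just color like last time.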

This idea is still very raw. I also wonder what it would be like to link the genetic makeup of each butterfly to a tweet. I’m not sure how that would work, but I think it could be interesting … or at least it sounds cool.

BUT, like I said, I don’t love either of these ideas. I’m going to finish Field of Seems, simply because it might be useful for my sportswriting gig. But I want to do something more. I want to use genetics to fish out something more than visual change; I want to optimize something in real life, which is why I’m looking through databases. It might be cool to use evolution and genetics with Field of Seems, but I’m not quite sure what the ideal use for it would be.

This is still in the works. I’ll update as I think more….
