Archive of hackathon projects
Previously at http://shariq.io/hackathons. Some projects I never bothered to do a write-up on; those are on GitHub and/or Devpost.
PennApps Spring 2013:
A web app which takes a name, finds a group of nouns that sound similar, and creates a wacky story out of those nouns (visualize the story to associate a face with a name). Came up with a phonetic distance algorithm. Team consisted of me and Ivan Melyakov.
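The actual algorithm isn't written up here, but one common way to sketch phonetic distance is edit distance over phoneme sequences instead of raw letters. The PHONEMES table below is a made-up stand-in for a real pronunciation dictionary (e.g. CMUdict), not the original implementation:

```python
# Sketch: Levenshtein edit distance over phoneme sequences.
# PHONEMES is a hypothetical stand-in for a pronunciation dictionary.
PHONEMES = {
    "shariq": ["SH", "AA", "R", "IH", "K"],
    "shark":  ["SH", "AA", "R", "K"],
    "cat":    ["K", "AE", "T"],
}

def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    dp = [[i + j if i * j == 0 else 0 for j in range(len(b) + 1)]
          for i in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            dp[i][j] = min(dp[i - 1][j] + 1,                       # deletion
                           dp[i][j - 1] + 1,                       # insertion
                           dp[i - 1][j - 1] + (a[i - 1] != b[j - 1]))  # substitution
    return dp[len(a)][len(b)]

def phonetic_distance(word1, word2):
    return edit_distance(PHONEMES[word1], PHONEMES[word2])
```

With this metric "shariq" sits much closer to "shark" than to "cat", which is the kind of neighborhood you'd want when picking story nouns.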
PennApps Fall 2013:
HIPAA Document Viewer
A web app to help medical staff distribute and manage access to sensitive medical documents. Not very exciting, but probably useful, and won the Best Use of Point.io prize. Team consisted of Ivan Melyakov, Jairam Patel, Kyle Headley, and me.
MHacks Fall 2013:
Spottr!
A hack originally intended to track all exercises through accelerometer data which ended up just counting reps. I spent most of my time attempting Pebble integration, and ended up getting in touch with the founder who gave me access to the unreleased Pebble 2.0 SDK which had accelerometer support after I signed an NDA and promised my first born. Full story here. Team consisted of Diego Quispe, Kevin Young, Kunal Sharma, and me.
hackNY Fall 2013:
A hardware hack originally intended to act as a sensory substitution device for sound by sending multiple encoded electric signals to the tongue, each representing a unique frequency band and processed so that the signal carried the information as 1-100 Hz signals. 15 minutes into training on the device, and 2 hours before the submission deadline, my tongue began to bleed, and we switched to building a tongue interface to control a mouse, targeted at those with disabilities. Won the Most Artistic prize (lol). Team consisted of me, Kunal Sharma, Kenny Durkin, and Scott Shevrin.
HackMIT Fall 2013:
Shock with Friends
Originally we intended to build something to generate power by walking to charge a cellphone - used an inflatable balloon attached to a person's shoes and captured air with a fan. Did not know this was an active field of research with few promising results. The power generated was not even one hundredth of that required to light up an LED, so, in the last hour, we switched to creating a game where players "shock" each other and have to guess who shocked them. Shocks came through hacked headphones - the website would play a high-pitched sound and the headphone wires would be placed on each player's tongue. Team consisted of me, Jason Cookman, Sarthi Andley, and Miguel Cotrina.
HackRU Fall 2013:
Metarial/Material
Started out building a voice-following robot, had issues with reliable audio sampling at high enough rates on the Arduino Uno, and switched at the last moment to this project. Did not contribute to building or thinking of this project, but helped with the demo. Won Best Hardware Hack. Team consisted of Michael Gubbels, Polina Vorozheykina, and me.
HackDuke Fall 2013:
Tempo API
Started out on a JavaScript game which relied on the tempo of music to set the pace of the game; created this API as an abstraction to use within the game, then decided to split teams. This ran off a laptop I borrowed from Jose Medrano, from within a MATLAB terminal. The notion of "tempo" took into consideration bpm, the derivative of bpm, volume, and the derivative of volume on different frequency bands. The API took search terms, looked them up on SoundCloud, and returned a vector with the tempo at each 0.5-second interval, along with a link to the piece of music. Team was originally me, Jose Zamora, and Charles Zhuang; after the split I was the only member of the team.
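The original MATLAB code isn't reproduced here; a rough sketch of what a per-interval tempo score combining those four features could look like, with entirely made-up weights:

```python
# Sketch of the "tempo" notion above: per 0.5 s window, combine bpm,
# volume, and their derivatives into one score. Weights are hypothetical;
# this is not the original MATLAB implementation.
def tempo_vector(bpm, volume, weights=(1.0, 0.5, 0.3, 0.2)):
    """bpm and volume are lists sampled every 0.5 s for one frequency
    band; returns one tempo score per interval."""
    w_b, w_db, w_v, w_dv = weights
    d_bpm = [0.0] + [b1 - b0 for b0, b1 in zip(bpm, bpm[1:])]
    d_vol = [0.0] + [v1 - v0 for v0, v1 in zip(volume, volume[1:])]
    return [w_b * b + w_db * abs(db) + w_v * v + w_dv * abs(dv)
            for b, db, v, dv in zip(bpm, d_bpm, volume, d_vol)]
```

Running this per frequency band and concatenating the results would give the vector the API returned.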
MHacks Spring 2014:
Immersive
Integration of Kinect and Oculus Rift using Touch Designer. Jackson Geller and I were responsible for using the Bloomberg API to grab stock data and pass it to Touch Designer. Jackson got the crappy DLLs working, but my solution to passing data from the API to Touch Designer failed at the last minute after hitting Google App Engine's request limit, and we ended up demoing only the integration of Kinect with the Rift, accomplished by Eric Mintzer in around 10 minutes. The positional tracking of the head, even with the crappy Kinect v1, significantly improved the DK1 experience. Team consisted of Eric Mintzer, Jackson Geller, and me.
BoilerMake Spring 2014:
Move Music
Reverse dancing - dance in front of the Kinect and have Touch Designer create music that sounds awesome and fits your dance moves. Ended up sounding horrible, but Manjur made it sound awesome. Fun project, using Touch Designer for the first time. Team consisted of me, Manjur Ahmed, Yoonshik Hong, and Jonathan Peer.
PennApps Spring 2014:
Hang Glider Simulation
Hardware hack where a Pebble's orientation controlled a rudder on a glider, and the glider streamed to an Oculus Rift, both over 4G, allowing the possibility of dropping the glider from a very high altitude. We never got around to dropping the glider from a quadcopter, since the glider didn't glide very well. This was absolutely horrendous VR, but an interesting project. Team consisted of me, Albert Guo, and Ibrahim Hashme.
McHacks Spring 2014:
Toastt
Started out working on Buzzfeedify, creating an algorithm to generate interesting Buzzfeed articles from a Facebook profile. Since I wrote the entire script in one shot on Pastebin, I ended up with an indeterminate number of bugs to fix, and told the rest of the team to switch to some obvious algorithm. I then switched to trying to make a fun collaborative drawing game, where everyone must draw the Platonic form of an object, since the winner is the one who draws whatever is closest to the average drawing. Did not finish in time. Buzzfeedify team consisted of Wajahat Siddiqui, Alexa Greenberg, Aboli Kumthekar, and me; Toastt was just me.
HackNC Spring 2014:
Project Equilibrium
A cryptocurrency arbitrage bot which returned cycles of cryptocurrency trades that would result in a profit, taking into account only trades that would occur instantly using an exchange's order book. Also returned maximum volume which could be put through the cycle. We came across multiple cycles that would result in 0.1-0.2 BTC profit, and demonstrated that arbitrage is a largely unexploited market inefficiency on cryptocurrency markets. Won Best Cryptocurrency Hack, Best Hack by Team from Multiple Schools, and placed in top seven. Team consisted of me, Smith Mathieu, Kenth Krueger, and Collin Bober.
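The bot's actual code isn't shown here, but the standard way to find such cycles is to weight each trade edge by -log(rate) and look for negative cycles with Bellman-Ford: a cycle of trades is profitable exactly when the product of its rates exceeds 1, i.e. when the -log weights sum to a negative number. A minimal sketch under that framing (rates below are illustrative, not real order-book data, and the volume calculation is omitted):

```python
import math

# Find a cycle of trades whose rate product exceeds 1 by running
# Bellman-Ford on edge weights -log(rate) and detecting a negative cycle.
def find_profitable_cycle(rates):
    """rates: dict (src, dst) -> effective exchange rate."""
    nodes = {c for pair in rates for c in pair}
    dist = {c: 0.0 for c in nodes}      # start from every node at once
    pred = {c: None for c in nodes}
    edges = [(a, b, -math.log(r)) for (a, b), r in rates.items()]
    last = None
    for _ in range(len(nodes)):
        last = None
        for a, b, w in edges:
            if dist[a] + w < dist[b] - 1e-12:
                dist[b] = dist[a] + w
                pred[b] = a
                last = b
    if last is None:                    # no relaxation on the final pass
        return None                     # -> no profitable cycle
    for _ in range(len(nodes)):         # walk back until we're on the cycle
        last = pred[last]
    cycle, node = [last], pred[last]
    while node != last:
        cycle.append(node)
        node = pred[node]
    return list(reversed(cycle))
```

The real bot additionally had to filter for trades fillable instantly from the order book and compute the maximum volume the cycle could carry, which this sketch skips.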
HackDuke Spring 2014:
Great Heights
An immersive VR simulation for people with a fear of heights, to help them overcome their phobia. Participants held their hand out and grasped at a virtual lever which allowed them to control their upward speed. Letting go of the lever caused the virtual platform the participant stood on to immediately stop moving. Jake hacked a fan from a motor and soda can, which we put on a cardboard box around the user for air flow. Made with Oculus Rift, LEAP Motion, and Unity. Team consisted of me, Trey Bagley, and Jake Rye.
Bitcamp Spring 2014:
Sixth Dimension
We aimed to build a bunch of cool VR things and eventually stick them all together. I worked mostly on getting Kinect V2 data into Unity, and mapping that onto a skeleton in Unity. Jake worked on building a fan array to stick on the Oculus Rift. Michael worked on LEAP Motions attached to hands, MindWave, and helped with Kinect V2. Omar worked on character rigging and a spaceship scene; the idea being two people would be in VR together and attempt to break out of a failing spaceship, and wind would indicate structural defects in the spaceship's hull. Ended up demoing only Jake's fans. We placed in top 8 (Bitcamp has no first/second/third) and won a DK2 for Most Dangerous Oculus Hack. The Kinect V2 data reached Unity, but I had trouble mapping it onto a skeleton. Team consisted of Jake Rye, Michael Gubbels, Omar Soloman, and me.
LA Hacks Spring 2014:
Tred
We built a device that allows users to move in virtual reality by walking in the physical world and constraining their movement. Used Oculus Rift, Unity, Arduino, and lots of hardware. Won first place. Team consisted of me, Jake Rye, and Charlie Hulcher.
VTHacks Spring 2014:
The Oculus Construction Kinection
By hooking up the Kinect V2 to Unity, we made a VR environment where a user spawned and manipulated objects through hand gestures, while walking around and wearing a Rift (DK1). Built in Unity. Won second place. Team consisted of me and James Shaffer.
HackBU Spring 2014:
First Person Mechanic
Kinect v1 + Rift + Unity to assemble and disassemble mechanical systems - maybe for education, training, or fun. Won first place. Team consisted of me, Yoonshik Hong, and Josue Cruz.
NUI Central Kinect for Windows Hackathon 2014:
VR 3D Modelling
Tried to do really good 3D modelling with a Kinect v2 and DK1. Failed horribly; laptops kept breaking down, we started late, spent too long brainstorming, set up a network to stream Kinect v2 data over ethernet to a Windows 7 machine with the DK2, etc. Had many parts working separately, but had trouble putting them together (e.g., saying "upload" and having the 3D mesh get uploaded to Thingiverse). The interface we designed was very interesting. One of the members of the winning team was super awesome and gave me the Kinect v2 he won (?!). Team consisted of me, Jake Rye, Yoonshik Hong, Eric Solomon, and Josue Cruz.
Greylock Hackfest Summer 2014:
VR 3D Modelling
Attempted VR 3D modelling again. Again, loads of problems... Tried to use LEAP, Kinect v2, and DK2. Completely nuked the machine after some bad linking within Unity; had to reinstall Windows twice. Pretty interesting interface, again (a different one). This time, LEAP was being used in a small area and the Kinect v2 in a larger area, and the stretch goal was to have the LEAP follow a person's hand by sticking it onto an XY table. Did not even have a demo because the DK2 machine kept breaking; ended up walking into the judge room with Waj wearing an unplugged DK2 and trying to be funny (did not work lol). Team consisted of me, Dan Gillespie, Wajahat Siddiqui, and Zach Fogg.
YC Hacks 2014:
Tablet God
Ian Field from Oculus and I were talking about VR input, and he had an idea involving a tablet in VR so that users have a familiar gesture interface. So my team tried to make a prototype by putting fiducials on a DK2 and on a tablet, and sticking a webcam on the DK2 to watch the tablet. We got the fiducial tracking working, with a live pose, but I failed at doing a simple coordinate transformation to bring the tablet from camera space into world space and botched the tracking. Ended up demoing Angry Birds in VR, with a tablet for input, and people loved it anyway. Team consisted of Ben Cohen, Geoff Vedernikoff, Alex Wissman, and me.
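For the record, the transformation I botched is just a composition of rigid transforms in homogeneous coordinates: T_world_tablet = T_world_headset · T_headset_camera · T_camera_tablet, where T_headset_camera is the fixed webcam mount on the DK2. A sketch with pure-translation matrices (all the offsets below are made up for illustration; the real transforms also carry rotation):

```python
# Compose 4x4 homogeneous transforms to bring a tablet pose from
# camera space into world space. Row-major lists; offsets are illustrative.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def apply(T, p):
    """Transform a 3D point by a 4x4 homogeneous matrix."""
    v = [p[0], p[1], p[2], 1]
    out = [sum(T[i][k] * v[k] for k in range(4)) for i in range(4)]
    return out[:3]

# Headset 1 m up in world space; camera mounted 5 cm forward of it;
# fiducial tracking reports the tablet 30 cm in front of the camera.
T_world_headset = translation(0, 1.0, 0)
T_headset_camera = translation(0, 0, 0.05)
T_camera_tablet = translation(0, 0, 0.30)
T_world_tablet = matmul(matmul(T_world_headset, T_headset_camera),
                        T_camera_tablet)
```

Applying T_world_tablet to the tablet's origin then gives its world position; getting the multiplication order (or a rotation convention) wrong is exactly the kind of thing that silently wrecks the tracking.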
MHacks Fall 2014:
Chicken VR
Had a hard time coming to agreement on a project; then decided to make a chicken dance simulator. Used two Myo armbands to track a user's wings, a DK2, and a chicken costume. Built in Unity. Got most of the way there, but the code kept breaking. Had a demo with a chicken dancing to the beat of the chicken song, but the animations were all left wing flaps. The user's wings were also being tracked, but they couldn't tell since the wings weren't shown in the scene. Team consisted of me, Jake Rye, and Gurpreet Singh.
PennApps Fall 2014:
Ridecoin
Started out trying to fill in the textured front-facing mesh of a person grabbed from a Kinect v2, for telepresence without a massive setup. Switched to Ridecoin after failing at writing a Unity shader. Ridecoin was distributed ridesharing on top of the Bitcoin protocol, with web-of-trust mechanics to incentivize good behavior. The protocol was interesting, but the mobile app didn't work well (begun at 3 AM Sunday). Team consisted of me, Ishaan Gulrajani, Zain Shah, and Zach Fogg.
HackTheNorth Fall 2014:
Follow the Leader
Bought a Hello Kitty kids' car to turn into a robot that would follow people around to carry their stuff. Used a Kinect v2 to track a person, and a wave gesture to indicate which person to follow. Underestimated the difficulty of actuating steering on the car, and spent the majority of the hackathon trying to get the steppers to work as expected from the Arduino. The software to track and follow used lots of math and was tested without being connected to the actuators, but seemed like it would have worked well enough. Why can't we buy simple robots like this already? Team consisted of me, James Shaffer, Darpan Shah, Jyna Maeng, and Ramkesh Renganathan.
HackRU Fall 2014:
VR Music Generation
Attempted to make music in VR with a Kinect v2 and DK2. Got the coordinate systems mapped to each other, grabbed lots of sound clips, created the scene, and the user could see themselves in VR, but the hand gestures were mysteriously not working and the tracking was horrible. Turned out there was a bug in the Kinect v2 SDK for that specific graphics card. Team consisted of me, Daghan Percinel, and Darpan Shah.
BoilerMake Fall 2014:
Telepresence
Attempted to stream 3D video from multiple Kinects to a user wearing a DK2 who would talk to a person within the 3D video. Everything was ready to rock (we were just going to stitch the facades together), except that I couldn't find documentation on the coordinate transformation from the depth camera to the color camera of the Kinects in Unity, even though I knew it existed. Found out you can buy Kinect v1s at the right place (GameStop, second hand) for only $25 now! Team consisted of me, Neville Jos, and Vishal Disawar.
HackNC Fall 2014:
Pulser
Originally attempted to detect outlier events in the live Twitter stream. Ended up working on predicting weather outliers, and the solution ended up working extremely well. Most fun part was deploying on 15 Digital Ocean droplets. Code is all here. Team consisted of me, Tom Cornelius, Anthony Castrio, Josh Preuss, and Daniel Manzella.
Y-Hack Fall 2014:
Thistle
Made a multi-user Python REPL. Open source. Front end didn't work properly. I am occasionally rewriting this so it doesn't rely on Firebase. It would be interesting to see the different ways people can program collaboratively, other than just writing to the same set of files. Is sharing the same namespace in a REPL a bad idea if you can chat at the same time? Team consisted of me, Tim Zulf, and Amir Kashanipour.
Stupid Hackathon Fall 2014:
BattleBot
Worked on a rap battle bot. Generated rap lyrics through some Markov chain stuff, rhyming APIs, and clever code, then rapped it with the say command on OS X. I wrote a bunch of Octave code to synchronize the beat with the rap, but it didn't work, so I ended up contributing nothing to the demo. Team consisted of Jonas Jongejan, Yuki Yoshido, and me.
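The generation code isn't shown here; the "Markov chain stuff" was presumably something like a word-bigram table walked at random. A minimal sketch (the corpus line is a placeholder, and the rhyming-API step is omitted):

```python
import random

# Sketch: build a word-bigram Markov chain from a corpus, then walk it.
def build_chain(text):
    chain = {}
    words = text.split()
    for a, b in zip(words, words[1:]):
        chain.setdefault(a, []).append(b)
    return chain

def generate(chain, start, length, seed=0):
    """Walk the chain from `start`, emitting up to `length` words."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        nxt = chain.get(out[-1])
        if not nxt:          # dead end: word never followed by anything
            break
        out.append(rng.choice(nxt))
    return " ".join(out)
```

On OS X the output line could then be piped to the say command to perform it.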
Design & Hack Fall 2014:
burgundy
Made a nice website for project names. Open source, and the website should be up. Second hackathon that weekend, right after Stupid Hackathon! Team was just me.
Dragon Hacks Spring 2015:
Portable Shower
Built an inflatable shower out of tarp which fits in a backpack: for the urban explorer. Requires a sink and power outlet (surprisingly, most public restrooms have power outlets). Had pumps near the bottom to pump water back out into the sink; had a big Ziploc bag inside to keep clothes dry when you took them off; had small holes in the structure releasing air to help dry hair. Team consisted of me, Gurpreet Singh, Harman Anand, and Victor Lourng.
PennApps Spring 2015:
Speech to Scene
Attempted to create convincing speech-to-scene conversion: describe a static scene and have it appear around you in virtual reality. e.g., "I am in Paris, with a table in front of me. There's a book on the table. The book is red." Locations were going to be 360 panoramas, and we had to build our own pipeline for reliable 3D model lookup (the best metric was a convnet doing object detection on renderings of the 3D models). Full source here.