Inside Man vs. Machine Hackathon



Then there is Eric Chung, 37, who has a dental background and previously co-founded a startup that simplifies dental care costs. He was placed on the "machine" team.

"I'll be honest, I feel very relieved to be on the machine team," says Chung.

At the hackathon, Chung built software that used voice and facial recognition to detect autism. Naturally, my first question was whether there are issues with this approach, such as biased data leading to false positives.

"Short answer, yes," says Chung. "I think some false positives may come up, but I think voice and facial expressions can actually improve the accuracy of early detection."

AGI 'tacover'

Like many AI-related things in San Francisco, the coworking space is tied to effective altruism.

If you know the movement only through the fraud headlines, effective altruism is about maximizing the good participants can do with their time, money, and resources. The day after the hackathon, the event space was set to host a discussion on how to use YouTube.

On the fourth floor of the building, flyers covered the walls. One, reading "AI 2027: Will AGI Tacover," advertised a taco party that had recently passed.

Half an hour before the submission deadline, coders munched on Ike's vegan meatball subs and rushed to finish their projects. One floor down, the judges had begun to arrive: Brian Fioca, Shyamal Hitesh Anadkat, and Marius Buleandra from OpenAI's applied AI team, and Varin Nair, an engineer at the AI startup Factory (which also collaborated on the event).

As judging began, Nate Rush, a member of the METR team, showed me a spreadsheet in which AI-assisted projects were colored green and human projects red. As the judges made their decisions, teams moved up and down the list. "Do you see it?" he asked me. I didn't. Thirty minutes into judging, the colored mishmash showed no clear winner. That was his point: to everyone's surprise, man versus machine was a close race.

Showtime

In the end, the finalists were evenly split: three from the "man" side and three from the "machine" side. After each demo, the crowd was asked to raise their hands and guess whether the team had used AI.

First up was ViewSense, a tool designed to help visually impaired people navigate their surroundings by transcribing live video feeds into text that can be read aloud. Given the short build time, it was technically impressive, and about 60% of the room (by the host's count) guessed it was built with AI. It wasn't.

Next came a platform for designing websites with pen and paper, using a camera to track sketches in real time. The Pianist Project reached the finals with a system that lets users upload piano sessions for AI-generated feedback; it was on the machine side. Another team presented a tool that generates a heatmap of code changes, showing critical security issues in red and routine edits in green. This one used AI.


