Passion Project 1: A man looking at a starry sky, made in Photoshop

Passion Project 2: Rick and Morty in their ship with a lake in the background, made in Photoshop

Passion Project 3: An orange truck, made in Adobe Illustrator

Passion Project 4: A SpongeBob popsicle, made in Adobe Illustrator

A Remembrance Day poppy in vector form, made in Adobe Illustrator

Tech Stories

1. Should States Be Allowed to Make Their Own AI Rules?

What’s the Issue:
Right now, the U.S. government is trying to pass a law that would stop states from making their own rules about Artificial Intelligence (AI) for 10 years. Some leaders think having one national set of rules will help tech companies grow faster and make it easier to compete with countries like China. But more than 260 state lawmakers from all 50 states say this is a bad idea. They want to keep their right to pass local laws that protect people from things like deepfakes, scams, and online threats.

My Perspective:
I think states should be allowed to make their own AI rules. Technology is changing fast, and not every state faces the same problems. For example, a big city might have more issues with AI-created scams than a small rural town. If states can act quickly, they might be able to stop harmful AI use before it gets worse. Also, local leaders often understand their communities better than the federal government does. It's risky to block every state from acting for 10 years when AI is evolving so quickly.

2. Google’s AI Can Make Fake Videos That Look Real

What’s the Issue:
Google made a new AI tool called Veo 3. It can create fake videos that look extremely real — like videos of protests or people doing things they never actually did. These videos could be used to spread lies, especially during elections or in other serious situations. Google has added some safety features, like blocking violent prompts and adding hidden watermarks, but experts say it’s still too easy to misuse.

My Perspective:
I think this kind of technology is scary because it’s hard to tell what’s real anymore. If someone sees a fake video that looks real, they might believe something that never happened. That could cause panic or even violence. I also think companies like Google should test these tools more before letting the public use them. We already struggle with fake news online — deepfake videos would make it worse. We need stronger rules about who can use this kind of AI and for what purposes.

3. A Coinbase Data Leak Started at an Outsourcing Company

What’s the Issue:
Coinbase, a company where people buy and sell cryptocurrency, had a serious data leak. It happened because a worker at an outsourcing company in India secretly took photos of customer information and gave them to hackers. Those hackers may have used the information to try to scam Coinbase users. After the leak, the outsourcing company fired more than 200 of its workers in India.

My Perspective:
This situation shows the risks of outsourcing important jobs like handling private customer data. While outsourcing can save companies money, it can also make it harder to control who sees sensitive information. I think companies like Coinbase should be more careful about who they trust with customer data. Maybe they should invest in stronger security or rely on workers who are better trained and more closely monitored. Trust is important in tech, and losing it can hurt both the company and its users.

NHL Predictions
