Victoria Web Development Portfolio

NHL Data Science Project

Victoria

Remembrance Day Poppy

Here is my poppy:

Technology News Stories

https://www.cnbc.com/2025/06/06/apples-wwdc-ai-strategy.html

At the 2025 Worldwide Developers Conference (WWDC), Apple is anticipated to unveil updates to its AI suite, Apple Intelligence, focusing on practical enhancements rather than groundbreaking innovations. Expected features include smarter battery management, personalized health insights, and improved messaging capabilities, such as polls and quick translations. However, significant overhauls, like a revamped Siri, are likely postponed. These incremental updates aim to enhance user experience by making devices more intuitive and efficient in daily use. 

From my perspective, Apple’s strategy of implementing subtle AI improvements reflects a commitment to user-centric design, prioritizing functionality over flashy features. While some may view the absence of major announcements as a lack of innovation, I appreciate the focus on refining existing tools to better serve users’ needs. This approach aligns with Apple’s reputation for delivering polished, reliable products, and I believe it will contribute positively to the overall user experience.

https://www.cnbc.com/2025/06/06/tesla-jumps-5percent-in-premarket-trade-as-stock-reels-from-trump-musk-drama.html

In early June 2025, Tesla’s stock experienced significant volatility due to a public dispute between CEO Elon Musk and President Donald Trump. The conflict began when Musk criticized Trump’s “Big Beautiful Bill,” a tax and spending proposal that threatened to eliminate electric vehicle tax credits, potentially reducing Tesla’s annual profit by $1.2 billion. In response, Trump threatened to revoke government contracts with Musk’s companies, leading to a 14% drop in Tesla’s stock and erasing approximately $152 billion in market value. However, the stock rebounded by over 5% the following day amid speculation of a possible reconciliation between the two figures.

From my perspective, this incident underscores the vulnerability of companies to political dynamics, especially when their operations are closely tied to government policies. The rapid decline in Tesla’s stock highlights how political disputes can have immediate financial repercussions for businesses and their stakeholders. It also raises concerns about the stability of government support for industries like electric vehicles, which are crucial for sustainable development. This situation serves as a reminder of the importance of maintaining clear boundaries between political interests and corporate operations to ensure economic stability and investor confidence.

https://www.cnbc.com/2025/06/06/uk-fca-to-lift-ban-on-crypto-etns.html

The UK’s Financial Conduct Authority (FCA) has announced plans to lift its ban on retail investment in cryptocurrency exchange-traded notes (ETNs), marking a significant shift in its approach to digital assets. Previously restricted to professional investors due to concerns over risk, ETNs are debt securities that track the value of cryptocurrencies like Bitcoin and Ethereum without requiring investors to own the underlying assets. The FCA’s proposal aims to support the growth and competitiveness of the UK’s crypto industry while allowing individual investors to make informed decisions about high-risk investments. However, the ban on retail trading of crypto derivatives will remain in place, and investments in crypto ETNs will not be covered by the government’s compensation scheme. 

From my perspective, the FCA’s decision to reconsider its stance on crypto ETNs for retail investors reflects a growing recognition of the maturation of the cryptocurrency market and the increasing demand for diversified investment opportunities. While the risks associated with crypto investments are undeniable, providing regulated avenues for retail investors to access these markets can enhance transparency and consumer protection. This move could position the UK as a more attractive hub for crypto innovation, aligning it with global trends and fostering a more inclusive financial ecosystem.

Adobe Photoshop Cat Bubble Tea Cafe

Adobe Photoshop Cat Bubble Tea Poster

Loopys Fruit Website

Physics Google Sheet Graph

Joah’s 2025 Portfolio

Video Walkthrough of Site

NHL Data Science Project

Joah Boland-Landa

Remembrance Day Vector Poppy

Mailchimp Campaign

My Tech Article

Nightshade Image Protection:

Nightshade is a free filtering tool that artists can apply to protect their works from image generation programs.

Ever since their introduction, generative AI image models and their developers have faced backlash and criticism for scraping and copying real artists’ pieces to train their models, all without the artists’ consent. There are questions of copyright infringement, intellectual property misuse, and a general feeling that artists never agreed to have their works contribute to a tool that undermines that same work. For a while, there was little they could do about it: lawsuits over copyright infringement often went nowhere, opt-out lists often weren’t respected, and it was hard to find any solid proof that a given work had been taken. So efforts shifted toward protecting the works artists would make in the future.

That push led researchers at the University of Chicago to develop Nightshade, a program meant to confuse AI image generation models into thinking an image is something entirely different. To quote from their website: “Nightshade, a tool that turns any image into a data sample that is unsuitable for model training. More precisely, Nightshade transforms images into “poison” samples, so that models training on them without consent will see their models learn unpredictable behaviors that deviate from expected norms, e.g. a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space.”

This groundbreaking tool is, aside from a similar program called Glaze, the artist’s only saving grace for protecting their images; the difference is that Nightshade takes an offensive rather than a defensive approach. It actively breaks the AI scraping programs that grab pieces from the internet without people’s consent, making them fundamentally misunderstand what they’re looking at, all while keeping the original image mostly unchanged and still recognizable to human eyes. Again, to quote from their website, this is how the process works:

“Like Glaze, Nightshade is computed as a multi-objective optimization that minimizes visible changes to the original image. While human eyes see a shaded image that is largely unchanged from the original, the AI model sees a dramatically different composition in the image. For example, human eyes might see a shaded image of a cow in a green field largely unchanged, but an AI model might see a large leather purse lying in the grass. Trained on a sufficient number of shaded images that include a cow, a model will become increasingly convinced cows have nice brown leathery handles and smooth side pockets with a zipper, and perhaps a lovely brand logo.”
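
To make the quoted idea a bit more concrete, here is a minimal, purely illustrative sketch of the general approach the passage describes: nudge an image so that a feature encoder “sees” a different concept while the visible change stays tiny. This is not Nightshade’s actual code or method; it assumes PyTorch and torchvision are installed, uses an off-the-shelf ResNet-18 as a stand-in encoder, and the loss weights, step count, and pixel budget are made-up values.

# Illustrative sketch only -- NOT Nightshade's real algorithm.
# Two competing objectives: (1) push the image's embedding toward a decoy
# concept, (2) keep the pixel-level change small enough that humans barely
# notice. Image preprocessing/normalization is skipped for brevity.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18, ResNet18_Weights

encoder = resnet18(weights=ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()          # use penultimate features as an embedding
encoder.eval()
for p in encoder.parameters():
    p.requires_grad_(False)               # only the perturbation gets optimized

def shade(image, decoy_image, steps=200, lr=0.01, budget=0.03):
    """image, decoy_image: 1x3xHxW tensors in [0, 1]. Returns a 'shaded' image."""
    with torch.no_grad():
        decoy_feat = encoder(decoy_image)                  # e.g. the handbag concept
    delta = torch.zeros_like(image, requires_grad=True)    # the near-invisible perturbation
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        perturbed = (image + delta).clamp(0, 1)
        feat = encoder(perturbed)
        concept_loss = 1 - F.cosine_similarity(feat, decoy_feat).mean()  # look like the decoy to the model
        visual_loss = delta.pow(2).mean()                                # stay close to the original for humans
        loss = concept_loss + 10.0 * visual_loss
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-budget, budget)                  # cap per-pixel change
    return (image + delta).detach().clamp(0, 1)

A real poisoning tool would target the encoders that text-to-image trainers actually use and make the perturbation much harder to detect or remove; the sketch is only meant to show the two competing objectives the quote talks about.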

However, some detractors of the program claim that AI image generation models have already moved past the filter and can still understand these images even with it applied. This is simply not true; the claim was taken out of context from a faulty news source. The program still works in its current state, although many artists understand that it will not work forever, and many use it simply to send the message that they don’t want their works used by generation programs. When OpenAI was asked how it feels about such programs being used to protect pieces of art, the company said that they take attacks against their systems “very seriously” and “are always working on how we can make our systems more robust against this type of abuse.”

Ironically, the one real problem with using such a filter is that it can get genuine pieces of art marked as AI-generated. Because of the way the filter works, it not only protects against AI models, it also makes tools that look for AI-generated art think these real pieces are AI-generated and flag them. This might send the wrong message: artists who really did put in the time and effort to make their pieces could get flagged for AI generation, and could even have their pieces removed in certain spaces even though they did nothing wrong. The filter trips up these detection tools because it leaves many of the same tells that AI-generated art pieces do, even though the artwork itself doesn’t visually look any different.

Overall, there are clearly still some issues to iron out with the program, like the flagging problem mentioned above. However, for many artists the benefits far outweigh the downsides, especially considering that their livelihoods and their art pieces are very much on the line.