• § Our first red cherry tomatoes have ripened. They are a few weeks behind our crop from last year but no less delicious. We’re still waiting on the yellow and black cherries.

    I also ate my first home grown summer squash of the season. The plant has been uncharacteristically prolific but, for whatever reason, many of its smaller fruits are suffering from blossom end rot. I tried supplementing with calcium and we will see if that helps.


    § OpenAI took its AI detection tool offline due to low accuracy. This isn’t surprising—I’ve seen a number of stories about professors rejecting student essays after they were inaccurately flagged as being AI written.

    I suspect OpenAI primarily developed its AI detection capabilities for internal use, to avoid “model collapse” by filtering today’s AI generated content out of future training runs. When used this way, an overzealous classifier is totally fine. Sure, you might filter out some genuine content but that isn’t a huge deal. When the same classifier is used as the sole means to judge the authenticity of student work, however, false positives start to become a lot more impactful.


    § We got a new stove! It is a lot like our old stove except, in this one, the oven actually functions properly. The burners supposedly have a higher BTU rating too but that hasn’t had any noticeable impact, in practice.


    § I started digging a hügelkultur which I have been mistakenly calling a hinterkaifeck.

    You might recall that I have been spending the summer sawing down a lot of tree branches and overgrown bushes.

    The volume of branches this has produced has been challenging. Sure, making a wattle fence has been great fun but that only uses so much wood. I’ve started to annoy my city’s garbage collection service with the bags upon bags of sticks and leaves I’ve been attempting to throw out each week.

    So, hey, maybe a hügelkultur would be perfect. You just bury some wood in the ground, plant things on top, and, as the wood breaks down, it feeds the growing plants above with a continuous supply of carbon.


    § Links

    § Recipes

    I tried making cheese with raw—scratch that, “pet”—milk that I purchased from a local Amish grocer.

    I spent a while trying to decide between a queso fresco and a farmer’s cheese before I realized that they are essentially the same thing. Mozzarella felt a bit too ambitious as an entry point.

    The whole process was not as difficult as I expected. Warm up the milk, add acid, wait, scoop out the curds, drain. That is really it.

    I also made ricotta with the leftover whey. I still have so much more whey left to use, though. I’ve read that you can use it in place of water in bread doughs. It is also supposed to have some unique qualities when used to soak beans. Some people make whey lemonade.

    This whole experience, while fun, makes me disinclined to ever want a dairy cow. At more than seven gallons of milk per day, I would need to get way more into dairy before that ever became anything more than burdensome.

    A goat on the other hand…

  • § All of our pepper varieties—jalapeño, cayenne, poblano, banana, and Hungarian—are growing fruit.

    Thanks to some early pruning help from the local deer population, two of our cherry tomato plants are still a manageable size. The San Marzano, however, is as vigorous as ever.

    I don’t even know what to say about the blackberry. It has already totally outgrown the patch I planted it in last year. I am both proud and overwhelmed.

    Our purple passionflower, which I had thought hadn’t survived the winter, has come back with a vengeance. There are shoots popping up as far as four feet away from the original plant. Researching more now, I see that it is considered invasive in some areas.

    Humid weather earlier in the week gave way to a violent, cathartic storm Thursday night, complete with thunder, lightning, and hail. Relentless sheets of rain flattened our young, top-heavy plants.

    By the next evening, just about everything was able to perk itself back up. The only casualty was a large, fruit bearing stem of one of the cherry tomato plants.


    § I rode an e-bike for the first time. As an exercise device it was, of course, less effective than a traditional bicycle. As a means of transportation, however, it might be unbeatable.

    The experience considerably expanded what I can see as a viable car-free commute. Unfortunately, my current commute would be upwards of an hour each way. Even on an e-bike that is not exactly viable.


    § After some struggle, I’ve finally made a breakthrough on my capacitive touch wall mural music sampler project. The turning point came when I decided to stop using MIDI altogether and instead use the Touch Board as a basic USB keyboard.

    This was all made possible by finally finding an application I had been searching for this whole time—a simple, keyboard-focused sampler app.

    The next step for the project is to start prototyping its physical design.
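    The pipeline is simple enough to sketch in a few lines. This is a toy Python illustration of the architecture, not code from the project; the electrode-to-key and key-to-sample mappings are entirely made up.

```python
# Sketch of the new architecture: each capacitive electrode on the board
# emits an ordinary keystroke, and a keyboard-focused sampler app maps
# keystrokes to samples. Both mappings below are hypothetical.
ELECTRODE_TO_KEY = {0: "a", 1: "s", 2: "d", 3: "f"}
KEY_TO_SAMPLE = {"a": "kick.wav", "s": "snare.wav", "d": "hat.wav", "f": "clap.wav"}

def sample_for_electrode(electrode):
    """Return the sample a touch on this electrode would trigger, or None."""
    key = ELECTRODE_TO_KEY.get(electrode)
    return KEY_TO_SAMPLE.get(key)
```

    The nice part of this design is that each layer is dumb: the board only knows about keys, and the sampler only knows about keys, so either side can be swapped out independently.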


    § I’ve been playing Connections, the latest Wordle-esque puzzle game from the New York Times. The goal is to sort a four-by-four grid of words into four groups based on their commonalities. Sometimes the solutions are straightforward—flute, clarinet, harp, oboe: all musical instruments—but often a few of the included words make things more ambiguous. Each game takes less than five minutes to complete, and it is never difficult enough to be frustrating but never so easy that it feels mindless—a tricky balance to strike for a game of this kind.


    § The Queen’s Gambit was captivating and further evidence for my theory that limited-run series are always better than their indefinite counterparts.


    § Links

  • It is not the most creative name they could have chosen, but Meta released a successor to their open source “Llama” language model yesterday.

    Meta:

    We’re now ready to open source the next version of Llama 2 and are making it available free of charge for research and commercial use. We’re including model weights and starting code for the pretrained model and conversational fine-tuned versions too.

    Unlike the original Llama release, Meta took the extra step to license this new model for commercial use.

    As Satya Nadella announced on stage at Microsoft Inspire, we’re taking our partnership to the next level with Microsoft as our preferred partner for Llama 2 and expanding our efforts in generative AI. Starting today, Llama 2 is available in the Azure AI model catalog, enabling developers using Microsoft Azure to build with it and leverage their cloud-native tools for content filtering and safety features. It is also optimized to run locally on Windows, giving developers a seamless workflow as they bring generative AI experiences to customers across different platforms.

    Just unbelievable positioning from Microsoft. Not only is their infrastructure powering all of OpenAI’s models, they are now working with Meta to support the leading alternative to OpenAI.

  • § My makrut lime plant has started growing a handful of tiny little fruits for the first time. I originally bought the plant for its aromatic leaves that are frequently used in Thai cooking. In my decidedly non-tropical climate I never expected to actually get any fruit. Exciting!


    § The Exploratorium Cookbook is such an unbelievable resource for building interactive educational experiences. Given my field of work, I am both shocked and a bit disappointed that I hadn’t heard of it until now.

    A crucial detail is that the book doesn’t prescribe regimented, step-by-step projects. Instead, it sticks to cataloging broad concepts and suggests ways one might go about presenting them.

    The way the book is structured as a series of numbered “recipes” reminds me of the sequential architectural and cultural “patterns” from A Pattern Language. I wonder if they both could be used in tandem…


    § I have been working with MIDI this week, prototyping different approaches towards creating a capacitive touch wall mural that acts as a musical sequencer / drum machine.

    It has been frustrating to learn that microcontrollers with great capacitive sensing capabilities, like the ESP32, are unable to send MIDI messages over USB. As I wait for a purpose-built device to arrive, I’ve been using an old Circuit Playground Express which, frankly, doesn’t work particularly well for this use-case.

    If you ever find yourself in a similar position, using MIDI in uncommon ways, I can’t recommend the application Midi View highly enough. It displays all of the information sent by connected MIDI devices in a straightforward, no-nonsense interface.
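    For context, the messages a tool like this displays are tiny. A MIDI note-on is just three bytes: a status byte encoding the message type and channel, then the note number and velocity. A minimal Python sketch (the helper names are mine, not from any library):

```python
# Build a raw MIDI note-on message: status byte (0x90 | channel),
# then note number and velocity, each in the range 0-127.
def note_on(note, velocity, channel=0):
    if not (0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("channel 0-15, note and velocity 0-127")
    return bytes([0x90 | channel, note, velocity])

def note_off(note, channel=0):
    # Note-off uses status 0x80; release velocity is conventionally 0.
    return bytes([0x80 | channel, note, 0])

# Middle C (note 60) at velocity 100 on channel 0:
msg = note_on(60, 100)
print(msg.hex())  # "903c64"
```

    Seeing those raw bytes laid out makes a monitor app much easier to interpret when a device misbehaves.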


    § After resisting for a while, thinking it was just another quirky comedy, I started watching Beef. It’s much better than I expected! It helps that the episodes are short enough that the drama isn’t too heavy and the comedy isn’t too cloying.

    Honestly, it may have been the title cards that first grabbed my attention. It turns out, the wild, maximalist, Egon Schiele-esque imagery was painted by David Choe, one of the show’s lead actors.


    § Does CVS employ cashiers nowadays? Surely they must, if only to approve alcohol sales, but I am not sure I can recall the last time I saw one. They have leaned heavily—more so than any other store I’ve visited—into self-checkout kiosks. The truth is that the future will almost certainly look less like Amazon’s fully autonomous corner stores and more like this.


    § Links

    § Recipes

    • Ginger-teriyaki beef kebabs
      • I still unintentionally overcook my beef but I am gradually getting better. This recipe was delicious, especially with fried rice. I made it twice this week.
    • Lemon butter feta chicken pasta
      • This was one of my favorite meals in quite a while. Cream + chicken broth + lemon juice are a great trio.
  • Ernie Smith:

    The thing that I think made the internet such an interesting place in its early years was because it didn’t feel like a controlled environment. The chaos was everywhere. It was messy. It was grimy.

    […]

    I will not say that this was perfect, but the chaotic effect was interesting, and interesting was often enough to continue using, because it meant there were always new surprises. For lots of people, chaos often breeds new ways of thinking.

    […]

    Threads threatens to be social media’s Disney World.

    Disney World has its place but I am more interested in the ragged edges, the avant-garde, social media’s… Chicago? But also the contemplative, slow, and deliberate—Lancaster?

    Long live the open web.

  • § I touched a floppy drive for the first time ever.

    A lot of my new job, at the moment, involves contending with a huge library of interactive Adobe Flash programs.

    As with floppy drives, I had never worked with Flash before although its legacy has always loomed large in the Creative Coding circles I frequent.

    There is something beautiful about it: an accessible way to create programs that are self-contained, cross-platform, multimedia, and interactive.

    I still really need to find a good, modern debugger, though.


    § Going from working in a single-story building to somewhere with four floors has had a dramatic impact on my “flights climbed” Health metric.


    § Apple’s marketing stunt worked: I watched Silo.

    Some details of the environment were great—I really liked the computer interface design—but, overall, the artificiality of the set was a bit of a turn-off.

    I kept finding myself imagining what the actors were experiencing as they were filming each scene. How big was the set? How immersive? Was it just an enormous wall of LCD screens? Watching the later seasons of Game of Thrones was a similar experience.

    Despite finding the design slightly off-putting, the story itself was original. I found myself genuinely surprised by the ending of season one and am sufficiently interested to see where the writers take the show from here.


    § By and large, the new season of Black Mirror feels silly. It also doesn’t seem particularly concerned with technology—you know, the reason it was named “Black Mirror” in the first place. An interesting choice.


    § The day after Meta launched Threads, their Twitter competitor, I finally got an invite to Bluesky.

    Frankly, I am not sure I see a future for Bluesky once Threads enables support for ActivityPub, the underlying protocol behind Mastodon (and this blog). Effectively everyone already has an Instagram account, which means they now automatically have a Threads account too. If you don’t have an Instagram account, you can join any Mastodon instance and will still be able to communicate with anyone on Threads.

    Bluesky, in comparison, will look insular—even after it leaves its long invite-only phase.


    § Caroline and I took down a large, ungainly juniper bush in our front yard. This gave us two things: a bunch of wood that is naturally rot-resistant and room to start growing a patch of watermelons. In the past, we haven’t had much luck growing pumpkins, America’s second favorite oversized cucurbit. We are hoping for better luck with watermelons and their shorter growing time.


    § Our quail all managed to break loose from their enclosure in the middle of a rain storm.

    Frankly, I am not entirely sure how long they were loose for. It wasn’t until I spotted Tumbleweed’s unmistakable dusty orange plumage as she was looking for shelter under a backyard tree canopy that I realized what had happened.

    Take a moment to picture me, frantic and soaking wet, chasing five small birds around my tiny, unfenced yard, attempting to catch them with a cheap Amazon.com butterfly net.

    Miraculously, I was able to round them all up and return them to the safety of their enclosure.


    § Links

  • Madhumita Murgia, Financial Times:

    Greg Marston, a British voice actor with more than 20 years’ experience, recently stumbled across his own voice being used for a demo online.

    Marston’s was one of several voices on the website Revoicer, which offers an AI tool that converts text into speech…

    Since he had no memory of agreeing to his voice being cloned using AI, he got in touch with the company. Revoicer told him they had purchased his voice from IBM.

    In 2005, Marston had signed a contract with IBM for a job he had recorded for a satnav system. In the 18-year-old contract, an industry standard, Marston had signed his voice rights away in perpetuity

    The problem isn’t AI here. The problem is that it is possible—and, apparently, standard—to sign vital rights away to companies.

    Not having full license over your own voice, as a voice actor, is ridiculous. It is unconscionable that we have allowed conditions to develop such that it has become an accepted part of the occupation.

    Pavis [a lawyer who specializes in digital cloning technologies] said she has had at least 45 AI-related queries since January, including cases of actors who hear their voices on phone scams such as fake insurance calls or AI-generated ads.

    Okay, AI voice synthesis companies definitely hold some blame here. Generating a new, non-specific synthetic voice is one thing; cloning an individual’s unique voice is something else altogether.

  • § It has been a bit of a hectic week.

    I started my new job which has been great but, you know, it’s a new job with new routines, procedures, and coworkers. I have a new Outlook account. I configured all of the shared calendars. I know where the mail room is.

    All of this means the garden has begun falling prey to nature’s entropy.

    I think we will all pull through.


    § Fluctuating with outdoor temperatures, my car gets anywhere between 30 and 45 miles of pure electric driving before switching over to its gasoline engine.

    My new job is closer to home and the parking garage has EV chargers. The last time I filled my gas tank was June 25th. Barring any unexpected road trips, my goal is to make it last to Thanksgiving. Stay tuned.


    § Season two of The Bear feels less electric than the first season.

    The first season was chaotic, stressful, and claustrophobic. Watching it was, at once, both exhausting and energizing—like the feeling in the air walking home from a late night concert.

    Season two has space, tenderness, non-diegetic music… It no longer feels cramped, shoulder-to-shoulder in the tight confines of a hot kitchen. We follow characters as they leave Chicago and experience the wider world. Although the story is driven by an impending deadline, it feels like we do more waiting than rushing.

    The Bear continues to be a special show to me, but, now, I think it is more about Chicago nostalgia than unique storytelling.


    § Links

    § Recipes

    • Rosemary sea salt caramels
      • Cooking is art, baking is magic, candy making is an unforgiving science. Timing is critical and temperature variations of less than 5 °F can be the difference between soft, chewy caramels and a non-Newtonian amorphous gloop. My candies fell somewhere in the middle. I need a better thermometer.
    • Authentic(?) chili con carne
  • Google is no longer working to build an augmented reality hardware platform. They will be shifting their energy towards creating AR software instead. It is hard to believe this wasn’t at least partially prompted by the Vision Pro.

    Hugh Langley, Business Insider:

    Google killed off a project to build a pair of augmented-reality glasses it had been working on for several years.

    […]

    The glasses, known internally by the codename Iris, were shelved earlier this year following layoffs, reshuffles, and the departure of Clay Bavor, Google’s chief of augmented and virtual reality, according to three people familiar with the matter.

    […]

    Since shelving the Iris glasses, Google has focused on creating software platforms for AR that it hopes to license to other manufacturers building headsets… One employee described Google’s new ambition as being the “Android for AR”

    Of course they should build “Android for AR” and sell it to whoever is interested but they shouldn’t let that get in the way of developing great first party applications for all headset platforms.

    The advantage of giving up on the hardware market is that they don’t have to weigh direct competition as heavily in their decision making.

    Meta, especially, must be thrilled.

  • § I will be starting my new job next week. There is no denying I will miss these past few weeks of vacation but, at the same time, I can’t wait to see what comes next.

    On a related note, I am not planning to post daily articles until I get settled into my new routine. I will, of course, continue publishing weeknotes each Sunday.


    § I saw Ari Aster’s new movie Beau Is Afraid.

    Wow, now that is a movie!

    Is it the best film I’ve ever seen? No, but it’s inventive and strange, deeply discomforting and hilarious.

    I wasn’t blown away by Ari Aster’s two previous films, Hereditary and Midsommar. They felt like well-executed examples of generic horror movie tropes.

    Beau Is Afraid is an entirely new idea. It is Ari, like the titular Beau, leaving his comfort zone.


    § More movies —

    After finding Synecdoche, New York way too depressing, I watched Wes Anderson’s two animated films: Fantastic Mr. Fox and Isle of Dogs, looking for a change of tone.

    Previously, my primary association with animation was movies and television directed at children. Without consciously intending to, I consequently viewed it as a less serious medium.

    I failed to appreciate that animation gives artists an enormous amount of control and the freedom to create exactly what they imagine, unbounded by the typical constraints of reality. When you give this technology to a filmmaker as precise and detail oriented as Wes Anderson, the results are spectacular.


    § My San Marzano tomato plant, which I am growing for the first time this year, has a distinctly different growth pattern than any other tomato variety I’ve seen before. It is super dense and bushy with very few suckers. Comparatively, my cherry tomatoes are almost lanky and sparse.

    Spider mites have been an absolute garden menace this year. I’m not sure what prompted their sudden invasion.


    § Old honeysuckle blossoms are great for cyanotype printing.


    § Links

    § Recipes

    • Caldo verde
      • A new favorite. It’s very similar to kapusniak, another beloved rustic potato soup, but lighter and simpler overall.
      • I made a few alterations: I used a mix of both leek and onion and added some lemon juice at the end. I also couldn’t find the special Portuguese sausage the recipe called for so I substituted it with chorizo. I’ll be on the lookout for proper linguiça moving forward.
    • Cajun gumbo
      • I definitely burned the roux. I realized it early on but, for whatever reason, decided to just keep going. Big mistake. I ended up letting it simmer on the stove all afternoon—like five hours—and that helped reduce the bitterness. It still had a distinctly burned flavor, though. I’ll try making this again another time. If I hadn’t burned it at the beginning I think it would have been amazing.
  • Apple:

    Apple today announced the availability of new software tools and technologies that enable developers to create groundbreaking app experiences for Apple Vision Pro — Apple’s first spatial computer.

    […]

    With the visionOS SDK, developers can utilize the powerful and unique capabilities of Vision Pro and visionOS to design brand-new app experiences across a variety of categories including productivity, design, gaming, and more.

    This is all a part of Xcode 15 which you can download today.

    Playing around with the visionOS simulator is fascinating. It already exposes a lot of the final operating system—including first party applications and design elements—that I hadn’t previously seen elsewhere.

    The new Reality Composer Pro application is also more powerful than I would have expected. It feels like a stripped down version of Unity3D. I hope Apple continues development on it. I would love to see it eventually become a full-fledged 3D development environment.

  • It is important to begin by noting that I am not a vegetarian—I eat meat.

    Still, there is something undeniably strange about eating meat nowadays. I think it stems from the fact that most of us are completely disconnected from the production of the meat we consume.

    Plus, we eat a lot more meat than ever before.

    In 1960, Americans ate an average of 28 pounds of chicken, per person, each year. In 2022 it was more than 100 pounds.

    More than 70 billion chickens are slaughtered annually. To put that number in perspective, it is estimated that 100 billion humans have ever existed throughout the entire life of our species.

    Again, I don’t mention all of this to be preachy or judgmental—I eat meat and I don’t raise the meat that I eat myself. All of this is to say that there is a serious cost to the ever-increasing quantity of meat that most of us consume.

    Jonel Aleccia & Laura Ungar, AP News:

    For the first time, U.S. regulators on Wednesday approved the sale of chicken made from animal cells, allowing two California companies to offer “lab-grown” meat to the nation’s restaurant tables and eventually, supermarket shelves.

    […]

    In a recent poll conducted by The Associated Press-NORC Center for Public Affairs Research, half of U.S. adults said that they are unlikely to try meat grown using cells from animals. When asked to choose from a list of reasons for their reluctance, most who said they’d be unlikely to try it said “it just sounds weird.” About half said they don’t think it would be safe.

    […]

    It could take a few years before consumers see the products in more restaurants and seven to 10 years before they hit the wider market… Cost will be another sticking point… Eventually, the price is expected to mirror high-end organic chicken, which sells for up to $20 per pound.

    There are still big challenges that need to be solved before cultivated meat can become mainstream. Consumer acceptance and cost are both particularly salient. At least now regulatory hurdles can be checked off of that list.

  • Billy Perrigo, Time:

    The CEO of OpenAI, Sam Altman, has spent the last month touring world capitals where, at talks to sold-out crowds and in meetings with heads of governments, he has repeatedly spoken of the need for global AI regulation.

    But behind the scenes, OpenAI has lobbied for significant elements of the most comprehensive AI legislation in the world—the E.U.’s AI Act—to be watered down in ways that would reduce the regulatory burden on the company

    The Time article above contains the entirety of a previously unreleased document OpenAI wrote for E.U. officials.

    Here is the thing: I agree with Altman that the E.U.’s AI Act was too broad. That isn’t where I take issue with this.

    The problem is that Altman has been spending his time publicly lobbying for regulation when it would hurt his competitors while privately pushing for the opposite when it would affect him.

    Again, an obvious push for regulatory capture.

    OpenAI has pledged not to compete with other companies in the event one of them gets close to surpassing OpenAI’s capabilities. The fear is that competitive “race dynamics” would lead to unsafe development and deployment practices.

    From OpenAI’s founding Charter:

    We are concerned about late-stage AGI development becoming a competitive race without time for adequate safety precautions. Therefore, if a value-aligned, safety-conscious project comes close to building AGI before we do, we commit to stop competing with and start assisting this project.

    This was again emphasized in the GPT-4 technical report:

    One concern of particular importance to OpenAI is the risk of racing dynamics leading to a decline in safety standards, the diffusion of bad norms, and accelerated AI timelines, each of which heighten societal risks associated with AI.

    There is the straightforward way to honor this promise: keep chugging along for now and, if a company later comes along and laps OpenAI, give up the fight fair and square.

    I think Altman’s actions these past few months have demonstrated he is taking another, less charitable, approach: if OpenAI can bog down competitors with arduous regulations, they will never have to give up their lead.

    So sure, you could say that this is consistent with their stated views on AI safety—they naturally trust their own development safeguards more than they trust others—but it is also hypocritical and dishonest.

  • Paul Ford:

    Dad wrote opaque, elliptical, experimental works of enormous profanity… The upshot was 70 years of writing on crumbling yellow onionskin, dot-matrix prints with the tractor feeds still attached, and bright white laser output, along with more than 10,000 ancient WordPerfect files and blog entries, including many repeats. Now all mine to archive.

    […]

    After I parsed and processed and batched his digital legacy, it came to 7,382 files and around 7 gigabytes.

    The sum of Frank took two days and nights to upload to the Internet Archive

    […]

    In time, we all end up in a folder somewhere, if we’re lucky. Frank belongs to the world now; I released the files under Creative Commons 0, No Rights Reserved. And I know he would have loved his archive.

    Visit Frank on the Internet Archive.

  • I am surprised that, at least according to all reporting I’ve seen, Apple isn’t planning to build haptic integrations between their existing Watch product and upcoming Vision Pro headset.

    It seems like such a great way to offer additional value to owners of both products.

    I have no doubt Apple could create an uncanny sensory feedback experience for hand tracking using Watch haptics alone. For example, think about the haptic feedback you get when using the Digital Crown to scroll through a list on the watch. Imagine applying that to the act of turning a virtual dial in AR.

    Ever since 2015, the trackpads in all of Apple’s laptops have been solid state—the “click” is simulated, there are no moving parts. They have arguably never felt better. In a sense, they are better than the genuine thing. More real.

    Adding physicality to the visionOS interface will both ground it in reality and deepen immersion while providing an instant level of familiarity to those using the new platform.

  • § I harvested our first sugar snap peas and strawberries of the season. There isn’t much, this early in the season, but eating something you’ve grown yourself is always a great feeling.


    § While sawing down tree branches a few weeks ago I set aside a couple of the larger branches, intending to use them to make a cat tree.

    I finally got started building it this week!

    I chose one of the branches, stripped off all of its bark, and wrapped the base in a thick green jute. The cats have already taken some interest in it.

    Unfortunately, the whole thing is attached to a 16” round base plate that is, I’ve come to find out, nowhere near sturdy enough. It looks like I’ll need to learn how to pour concrete to make a new base for it all.


    § I saw two movies, Blackberry and Tetris, which feel like two different takes on the same general premise: “Follow a scrappy upstart technology company as they risk everything to bring their vision to life.”

    There was something endearingly quirky about Tetris that I found really fun. Instead of using chapters, the film was broken out into “levels” with funky pixel art animations preceding each one. In comparison, Blackberry was conventional—a modest retelling of an interesting story rather than an interesting retelling of a modest story.


    § …Speaking of blackberries

    The blackberry bush I planted last year is doing amazingly well. I never expected it to come back this spring with such a vengeance. It is already at least eight feet tall and is covered in dozens of tiny white blossoms.

    It is doing so well, in fact, that I decided to buy another raspberry bush after losing two of them to a mysterious disease last summer. Fingers crossed it fares better this year.


    § I would love to play an alternative “roguelike” version of The Depths in Tears of the Kingdom.


    § Links

    § Recipes

    • Gluten free pierogis
      • Amazing. One of the easiest gluten free doughs to work with.
    • Sauteed morel mushrooms
      • Ever since unexpectedly getting way into mushrooms this spring, I have been looking forward to trying morels for the first time. Well, I was so excited when I finally found a bag of freshly picked morels at my usual grocery store. After finally getting the opportunity to try them it was, overall, a rather upsetting experience. Take a look at the third paragraph under the “cleaning morels” heading above if you are curious to know why—gross!
  • I was excited when StabilityAI—the company behind Stable Diffusion—launched StableLM, their open source language model with a commercially permissive license. I was convinced it would become the new hub for open source community development.

    Prior to the announcement, developers had coalesced around Meta’s LLaMA model which had always been a somewhat tenuous situation. It was initially only available to select researchers before it was leaked to the public. Since then, the company hasn’t been entirely clear in its messaging. On one hand, Mark Zuckerberg has expressed a desire to commodify generative AI through open source contributions. On the other hand, they have been issuing DMCA takedown requests for seemingly innocuous projects that incorporate LLaMA.

    Now, two months after StableLM’s launch, it has become clear how difficult it is to redirect inertia. The open source community has continued contributing to LLaMA and development on StableLM has stalled. As I write this, there have been no updates to the StableLM code since April.

    Well, it seems like Meta might be on the verge of announcing a successor to LLaMA with a more permissive license, allowing for commercial use.

    Sylvia Varnham O’Regan, Jon Victor, and Amir Efrati, The Information:

    Meta is working on ways to make the next version of its open-source large-language model—technology that can power chatbots like ChatGPT—available for commercial use, said a person with direct knowledge of the situation and a person who was briefed about it. The move could prompt a feeding frenzy among AI developers eager for alternatives to proprietary software sold by rivals Google and OpenAI.

    Although Meta didn’t originally intend for the open source language model community to form around their models, they may as well come out and fully embrace it. It is their best chance at disrupting Microsoft’s and Google’s dominance.

  • Watching language model tooling slowly mature, it is interesting to see a progressive constraining of capabilities.

    Historically, programming languages have become more abstract (“higher-level”) over time:

    Assembly → C → Python

    With language models, we may have arrived at the highest possible level of abstraction—natural language—and now we are beginning to wrap back around the other way.

    A high level of abstraction is great in that it lowers the barrier to entry for programming but it comes with the cost of increased ambiguity. Sure, your compiler can now try to guess your intentions but that doesn’t mean you would always like it to do that.

    Even more important is the fact that language models are non-deterministic. That is, each successive time you run your “program” you might receive a different output.

    This is a huge problem, almost a non-starter when it comes to integrating LLMs into traditional programming pipelines. That is why so much research has gone into making LLMs reliably output a more constrained set of tokens that can be validated according to a predetermined schema.

    JSONformer, GPT-JSON, and Guidance are all examples of prior work along these lines.

    Well, earlier this week OpenAI announced a new API endpoint that points to models they finetuned for exactly this purpose.

    OpenAI:

    Developers can now describe functions to gpt-4-0613 and gpt-3.5-turbo-0613, and have the model intelligently choose to output a JSON object containing arguments to call those functions. This is a new way to more reliably connect GPT’s capabilities with external tools and APIs.
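    In practice, you describe each function with JSON Schema and parse the arguments the model hands back. The sketch below only constructs the payload and parses a mocked reply rather than calling the API, and `get_weather` and its parameters are made-up examples:

```python
import json

# A hypothetical function the model may choose to "call". Its parameters
# are described with JSON Schema, which the finetuned models were trained
# to respect.
functions = [
    {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["city"],
        },
    }
]

# The assistant message carries the chosen function and its arguments as
# a JSON string, which you parse before dispatching to real code.
# (This response is mocked; a real one comes back from the API.)
mock_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_weather",
        "arguments": '{"city": "Detroit", "unit": "celsius"}',
    },
}

args = json.loads(mock_message["function_call"]["arguments"])
```

    Even here, the arguments string is model-generated, so it is still worth validating before acting on it.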

    I can’t wait to see what people are able to accomplish using these new capabilities.

  • Byron Tau and Dustin Volz, The Wall Street Journal:

    The vast amount of Americans’ personal data available for sale has provided a rich stream of intelligence for the U.S. government but created significant threats to privacy, according to a newly released report by the U.S.’s top spy agency.

    Commercially available information, or CAI, has grown in such scale that it has begun to replicate the results of intrusive surveillance techniques once used on a more targeted and limited basis, the report found.

    Intelligence agencies don’t need to request a warrant for a piece of information if they can purchase it from public sources instead.

    The proliferation of data brokers who specialize in compiling and selling sensitive information has only exacerbated this problem.

    Quoted directly from the report:

    Under the U.S. Constitution… CAI is generally less strictly regulated than other forms of information acquired by the [intelligence community (IC)], principally because it is publicly available. In our view, however, changes in CAI have considerably undermined the historical policy rationale for treating [publicly available information (PAI)] categorically as non-sensitive information, that the IC can use without significantly affecting the privacy and civil liberties of U.S. persons. For example, under Carpenter v. United States, acquisition of persistent location information… concerning one person by law enforcement from communications providers is a Fourth Amendment “search” that generally requires probable cause. However, the same type of information on millions of Americans is openly for sale to the general public. As such, IC policies treat the information as PAI and IC elements can purchase it.

    I understand that it would be foolish to expect intelligence agencies to abide by a stricter set of data privacy rules than civilians. Still, I don’t feel great about public money being used to support and encourage data brokers.

    In the end, you can’t sell what you don’t have. This report reinforces my view that end-to-end encryption should be the only acceptable solution for storing personal information.

  • OpenAI:

    In recent years, large language models have greatly improved in their ability to perform complex multi-step reasoning. However, even state-of-the-art models still produce logical mistakes, often called hallucinations.

    […]

    We can train reward models to detect hallucinations using either outcome supervision, which provides feedback based on a final result, or process supervision, which provides feedback for each individual step in a chain-of-thought… We find that process supervision leads to significantly better performance, even when judged by outcomes.

    This technique was evaluated using questions from a large mathematics dataset. This is an important caveat as math is a domain that is well-versed in the practice of “showing your work.” Presumably GPT-4’s training corpus includes many instances of people walking through math problems step-by-step. The relative performance of process supervision when it comes to questions from other domains is still unknown.

  • Facebook has released another open source model as they work to commodify generative AI.

    Facebook Research:

    We introduce MusicGen, a single Language Model (LM) that operates over several streams of compressed discrete music representation… we demonstrate how MusicGen can generate high-quality samples, while being conditioned on textual description or melodic features, allowing better controls over the generated output.

    I was amazed by Google’s MusicLM model earlier this year. Facebook provides side-by-side comparisons here that demonstrate MusicGen is clearly superior. It isn’t an enormous leap, but audio generated using Google’s model has a distinct “compressed” quality that is greatly diminished in Facebook’s implementation.

    More importantly, MusicGen is completely open. Google only recently allowed beta testing of MusicLM through their AI Test Kitchen App and, even so, generations are limited to 20 seconds. Here, Facebook released both their code and model weights on GitHub and spun up a Colab notebook demo.

  • § I can see fruit beginning to develop on our blackberry, strawberry, and snap pea plants. I can’t wait for the rest of the garden to fill out—I planted two more tomato seedlings and six different types of peppers.

    We also picked an almost burdensome amount of lettuce. Just as I was starting to ready myself for a week of enormous salads, Caroline had the ingenious idea of using them in our long-neglected juicer. That quickly lightened our load.


    § Speaking of my inability to stick to reading one thing at a time, I started reading The New House by David Leo Rice after seeing James Reeves’ passionate recommendation.

    Frustratingly, I can’t find a digital copy of the book anywhere, so not only am I jumping around too much, I don’t even get to use the reMarkable tablet to help in this case.


    § I tried to savor the new season of I Think You Should Leave and still finished it in less than a week. My favorite sketch was The Driving Crooner.

    Overall, I feel like this season was maybe slightly worse than the previous two. I still highly recommend watching it, though.


    § I thought this might be the first year that I would skip the iOS beta. I made it two days before installing iOS 17 on my phone. Playing around with the updated autocorrection engine has been interesting but, overall, there is really not much to see.


    § Since its introduction at WWDC last year, I haven’t seen much mention of Apple’s RoomPlan API. Try it out if you want a taste of the technology behind the upcoming Vision Pro headset. You can watch your iPhone construct an accurately scaled 3D model of a room—in real time—with each architectural element and furniture item segmented and tagged. It is shockingly impressive.


    § Links

    § Recipes

    We had a lot of quail eggs we needed to use so Caroline and I made a big batch of lemon curd with the yolks and angel food cake with the egg whites. The lemon curd had a bit of a grainy texture. I am not sure if that is due to the recipe or just a consequence of not having a proper double boiler. Regardless, it tasted great, especially as a part of lemon curd ricotta pancakes.

  • It is official: Cortana is dead.

    Microsoft:

    We are making some changes to Windows that will impact users of the Cortana app. Starting in late 2023, we will no longer support Cortana in Windows as a standalone app.

    […]

    We know that this change may affect some of the ways you work in Windows, so we want to help you transition smoothly to the new options. Instead of clicking the Cortana icon and launching the app to begin using voice, now you can use voice and satisfy your productivity needs through different tools.

    They go on to pitch their new GPT-powered “Copilot” features.

    Watch out Google Assistant, you’re next.

  • AI generated video still lags behind AI imagery by quite a large margin. Still, some artists are forging ahead and exploring what is possible with the tools available today.

    Will Douglas Heaven, MIT Technology Review:

    The Frost is a 12-minute movie in which every shot is generated by an image-making AI.

    […]

    To make The Frost, Waymark took a script written by Josh Rubin, an executive producer at the company who directed the film, and fed it to OpenAI’s image-making model DALL-E 2… Then they used D-ID, an AI tool that can add movement to still images, to animate these shots, making eyes blink and lips move.

    Some of the characters almost look like Kara Walker’s paper cutout silhouettes; others have more detail, as if they were assembled out of various magazine clippings; none of them look alive. There is a pervasive surreal sense that everything in The Frost’s world has been reanimated. It is weird and new and fascinating. Highly recommended.

  • As the dust settles on Apple’s Vision Pro headset announcement, critical reactions mostly center on the same thing: wearing big goggles around other people is weird and no one is going to want to do it.

    Ben Thompson articulated this general critique quite clearly:

    I didn’t even get into one of the features Apple is touting most highly, which is the ability of the Vision Pro to take “pictures” — memories, really — of moments in time and render them in a way that feels incredibly intimate and vivid.

    One of the issues is the fact that recording those memories does, for now, entail wearing the Vision Pro in the first place, which is going to be really awkward!

    …it’s going to seem pretty weird when dad is wearing a headset as his daughter blows out birthday candles

    This isn’t the first time we’ve had to contend with weird new technology. Matt Birchler offers the two most likely paths the Vision Pro might take:

    The question is, what’s this going to be like:

    1. AirPods, which many people thought looked silly at first but then people got used to them.
    2. Camcorders, which took decades and massive advances in the tech to go from kinda awkward to mainstream.

    When AirPods first launched, I remember how viscerally strange I found them. Now, not only do I use AirPods religiously, I don’t even remember why I thought they were so weird in the first place. If Apple can pull that off again, we will be in for a wild next few years.
