Michael W Lucas @mwl@io.mwl.io:
Inspired by a discussion elsewhere:
I've been on the Internet since 1987, started a career building the commercial
Internet in 1995, and have spent the last 25 years writing books about how to build
foundational Internet infrastructure. I've consulted for and worked with any number of
dot-coms, and the one lesson I've gotten over and over again?
The Internet's business model is betrayal.
We have no smart lights. No voice assistants. No Alexa or Siri. No video
doorbell. Our thermostat and appliances constantly complain about their lack of Internet.
None of this stuff is safe.
The Internet tech I do use? A desktop PC. Email on my phone is for travel
only: airplane tickets, hotel reservations, hockey and concert tix. Location on my phone?
Nope, we use a dedicated non-networked GPS in the car. The microphones are off.
How can a light bulb betray me? I don't know. I do know that the vendors have
put a LOT of thought into it, though, and I can't out-think all of them.
If GenX would stop using "f/u" to mean "follow-up" in email subject lines, I wouldn't complain.
KJ Charles has a Bluesky thread about the AI-powered evolutions of the "book club" scam.
Wishlisting cameras to replace our Ring system, and holy shit marketing departments are failing. Looks like Reolink is the leader, but digging through each product description and trying to figure out how these things fit together is a total pain in the ass.
User stories, folks. Use them.
Max Leibman
@maxleibman@beige.party
Oh, sure, when *the company* automates my job and keeps collecting the profits,
that's "innovation," but when *I* automate my job and keep collecting a paycheck, that's
"timeclock fraud."
Assaad Abousleiman on LinkedIn
The last decade of software was built to capture attention.
The next decade will be built to give it back.
I don't agree with the "plausible sentence generators are the future" conclusion that the
rest of his essay builds toward, but I like the strong opener. We have a decade or so of
computing that's actively user-hostile, and we need software we can trust, software that
is on our side.
I do agree with two points:
First, that we need to treat the computing developments of the last decade or decade and a
half as actively hostile. Google, Facebook, Apple, Microsoft, et al. have all gone over
completely from enabling us to finding ways to extract every possible bit of value from us.
Built-in applications on our platforms have gone from utilities to worthless for our own
data unless we cave to demands for additional subscription payments. From media players to
just using our own damned hard drives, it's getting harder and harder to use our own data;
the focus has become finding ways to sell us mediated subscriptions.
We're no longer in control of what we see; instead we're being fed information that serves
the wants of capital in ways that emotionally trigger us, with automated measurement of how
effective those feeds are. Our conversations with our friends and our communities are being
mediated by hostile forces.
In the social media and email tools of the '90s, we had the ability to build incredibly
nuanced filters to help us automatically control what information we were going to let the
assholes impose on our lives. Now, the best of these tools (things like Mastodon on the
Fediverse) give us simple yes/no keyword filtering.
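To make that contrast concrete, here's a rough Python sketch (the message fields, addresses, keywords, and scoring threshold are all invented for illustration, not taken from any real mail client or Fediverse server): the '90s-era tools let you compose allow-lists, pattern matches, and scores into a decision, while today's keyword filters are a flat yes/no.

    import re

    # Hypothetical message, just for the example.
    msg = {
        "from": "deals@marketing.example.com",
        "subject": "LIMITED-TIME OFFER just for you",
        "body": "Click now before it's gone.",
    }

    def nineties_style_filter(msg):
        """Composable rule: allow-list first, then score several conditions before deciding."""
        if msg["from"] in {"friend@example.net"}:  # explicit allow-list wins outright
            return "inbox"
        score = 0
        if msg["from"].endswith("@marketing.example.com"):
            score += 2
        if re.search(r"(?i)limited[- ]time offer", msg["subject"]):
            score += 3
        return "junk" if score >= 3 else "inbox"

    def keyword_filter(msg, keywords=("crypto", "giveaway")):
        """Today's typical filter: hide the post if any keyword appears, otherwise show it."""
        text = (msg["subject"] + " " + msg["body"]).lower()
        return "hidden" if any(k in text for k in keywords) else "shown"

    print(nineties_style_filter(msg))  # -> "junk": the composed rule catches it
    print(keyword_filter(msg))         # -> "shown": no keyword hit, so it sails through

The point isn't the particular conditions; it's that the old tools let you stack and weigh them, and the new ones don't.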
Second, that this software needs to help us automate processes that we currently do
manually. As operating systems have moved from the command line to the GUI, we've lost the
physical artifacts of process. I think it's worth diving deeper into this.
Every use of an LLM to write code is an acknowledgement of the failure of the programming
language it's generating code in. If we can describe the process well enough that a lossy
plausible sentence generator can guess at what we meant, why can't we make the language
express that same meaning unambiguously, in ways that are accessible?
We need a move forward in computing language design: languages with grammars flexible
enough that people can express what they mean, so that we can iteratively guide them toward
a repeatable formal definition that they understand and that the computer can
deterministically execute.
Finally, we need business models, and computing tools, that serve us, rather than those
who are looking to further exploit us.
Thinking about those videos that came out of the occupation of Iraqi cities, of US forces
shooting up commuters and people just trying to get around and live their daily lives, and
how our political process somehow decided it was a good idea to bring that chaos to
domestic policing.
https://www.mprnews.org/story/...nas-coffee-ice-car-crash-st-paul