The Internet Will Be Televised

Today in Tedium: Terrible product ideas are a dime a dozen, but what about ideas that are fascinating, and perhaps executionally sound, but conceptually flawed? How often do they come about? And how often do they stick around the market for 17 years, despite fairly limited public interest? Such is the case of WebTV, the set-top device that allowed you to surf a television-optimized version of the web. On September 18, 1996—20 years ago this week—a startup firm released a device meant to bring the internet to the living room. It was a bet that people didn't want to use a computer set up at a desk to get online—a bet that turned out to be half-right, because ultimately people didn't want to be tethered at all. Today's Tedium ponders what we learned from the great, expensive WebTV experiment. — Ernie @ Tedium

"We won't tell you that TV will never be the same, because it will. It's you who will be different. Just watch."

— An introductory line spoken during a media roadshow-style event ahead of the launch of WebTV, on July 10, 1996. Roughly a year earlier, Steve Perlman and two other co-founders had launched the company. During the presentation, Perlman, a onetime Apple employee who invented the firm's QuickTime technology that is still in heavy use today, called the WebTV reveal historic. "We've repackaged the internet to be designed for this type of medium," he said.


How hard was it to sell television-optimized internet to the public?

Despite being heavily hyped, WebTV struggled to gain traction during its early years.

The technology itself, first conceptualized by a startup of the same name and distributed by tech giants Sony and Philips, was impressive and forward-thinking at the time, taking advantage of a budding mainstream interest in the web as well as the fact that large swaths of the public (read: baby boomers and older) were not as adept with computers as their kids.

But the web was simply not designed for reading 10 feet away from the screen, despite changes made by Perlman and other developers that specifically focused on television readability. They made the text bigger, worked hard on the typography, and made it easy to scroll through links with a remote control. But it was still a slog.

Usability expert Jakob Nielsen, while impressed by the level of readability WebTV's developers got out of the device ("you have to see it to believe," Nielsen said of how clearly the text showed up on a high-quality TV of the era), ultimately found the experience too limited for everyday use.

"WebTV achieves a very high level of usability given its design constraints," Nielsen wrote in 1997. "Unfortunately, the constraints are so severe that even this great design ultimately fails to provide an optimal Web user experience."

Perhaps one telling sign of the challenge of selling the public on WebTV comes from the roadshow clip. Cofounder Bruce Leak, as he showed off the capabilities of the system while searching for the television show Friends, had to stall a bit when the 1996-era website for NBC's Friends, a fairly primitive site by today's standards, struggled to load. The issue was two-fold: modems were slow, and the page, additionally, wasn't optimized to work effectively on WebTV. There were a lot of pages like that at the time.


(Because I was curious, I tried out the WebTV developer tools to see how this horribly primitive system could handle Tedium's front page. As you can see above, the results are sexy.)

The solution to this issue, of course, is not unlike what we see happening today on mobile platforms: responsive websites, dedicated apps, and progressive approaches designed to take advantage of the small screen so that content is readable anywhere. But unlike mobile phones, where the user base simply couldn't be ignored, WebTV for much of its history was the only company doing any of this stuff, and it didn't have enough subscribers to be taken seriously, so the idea of selling developers on creating a television-formatted version of their website was a nonstarter.

"It was a walled-garden dream of developing a WebTV version of the internet, just as AOL was doing at around the same time, and as Netscape attempted with its proprietary browser extensions," recalled Brad Hill, a marketer of the service at the time of its launch. "There was some uptake on the part of publishers, but most sites in the exponentially expanding web simply allowed WebTV to reformat their pages for better or worse. Sites were often ugly through the WebTV lens, and sometimes unusable."

But those obvious weaknesses didn't stop Microsoft from spending $425 million to purchase WebTV, less than eight months after its public launch. Maybe they saw something the public didn't.


Five interesting facts about WebTV

  1. Thomas Dolby, he of "She Blinded Me With Science" early-MTV fame, was involved in creating WebTV's notable sound blips through his company, Headspace. Dolby's company at the time was pushing an early music-streaming technology called RMF, meant to be complementary to the MIDI and MP3 formats. WebTV was one of the few devices that supported it natively.
  2. For a brief period of time, WebTV's encryption was considered so tough that the U.S. government classified it as munitions, preventing the device from being released in Europe and other countries because it was considered a weapon of sorts. The laws were eventually changed, because (as anyone could figure out by using it) WebTV was not a weapon.
  3. WebTV, as it turned out, had at least one virus. In 2002, a man named David Jeansonne emailed WebTV users a malicious program that claimed to let them change the on-screen colors, but actually changed the device's dial-up number to 911. Just 10 people were affected, but Jeansonne received prison time for the attack.
  4. Like the Prevue Channel, WebTV did have a community focused on hacking the device. Among the sneakier users of the platform was Matt Squadere, who has periodically been updating a site called HackTV since 1998. The site features info about secret areas on the platform and hacks to get games to work on a WebTV device. His actions actually got him banned from the service back in 1999.
  5. WebTV was not designed for video games, but at least one video game console supported WebTV. Owned by Microsoft at the time, the WebTV team helped get a Japanese version of the service on the Sega Dreamcast.

250k

The number of subscribers to WebTV in February 1998, according to the New York Times. In comparison, AOL had 11 million subscribers at the end of that March, according to CNN Money. By December 1999, CNN reported that WebTV subscribers had topped 900,000.


Why Microsoft mostly left WebTV (also known as MSN TV) alone for 16 years

There was a period of time when WebTV could be considered something of a hit, if a modest one, and it came within the first year or two after Microsoft made its eye-popping purchase of the firm—a purchase that, compared to Yahoo's $3.57 billion acquisition of GeoCities in 1999, wasn't too bad.

But the thing is, the internet was moving too fast for a set-top device of this nature. While WebTV got an upgrade soon after its purchase by Microsoft that added some fairly interactive video features anticipating the launch of TiVo a few years later—as highlighted by this William Shatner-starring WebTV commercial from 1998—its web engine quickly found itself out of date, unable to keep up with the speedy changes of browsers like Netscape, Internet Explorer, and later, Firefox.

WebTV's proprietary setup was part of the problem, as it was designed from the ground up to offer a television-optimized experience. This meant that it couldn't easily take advantage of the open-source offerings of the era, and that porting Microsoft's comparatively better Internet Explorer would be a challenge. (It also didn't help that screen resolutions were quickly improving on computers, making the low resolution of TV sets even more of an issue.)

The Windows giant ultimately shifted the device's gears. It changed the name of the platform to MSN TV in 2001 and started giving away the devices to MSN members, convincing them to stay with the service (hey, they were desperate). Then, when it came time for an upgrade in 2004, the company released the MSN TV 2 with innards closer to the original Xbox than to the original WebTV. It had more robust media player features, along with an Intel Celeron processor. More so than the original WebTV, it was essentially a computer that worked on your TV.

Ron Goldin, a designer for Microsoft during the era, considered the interface design a major upgrade from the WebTV.

"The complete overhaul of the design for a then 4:3 resolution TV was ground-breaking in terms of its simplicity in navigating non-linear grids with no pointing device, as well as a custom keyboard for more involved tasks like composing emails," Goldin wrote on his website.

The new device utilized Internet Explorer, much closer to what people were actually using on their desktops, but the device still wasn't ideal for surfing the web—in part because TV screens by this point still hadn't caught up with their desktop competitors.

"The web browsing experience will depend on the sites that you visit," an UberGizmo review from 2006 stated. "Because MSN TV is using a somewhat low resolution, some websites won’t look the way they were intended to. Problems can go from scaled-down images to missing navigation."

In a lot of ways, the MSN TV 2, which was distributed by RCA, set the stage for all the devices we use on our TVs now. But it didn't get much in the way of upgrades after 2004—it was left to die on the vine.

But in 2013, more than 15 years after originally purchasing WebTV in 1997, and nine years after the release of the MSN TV 2, Microsoft said enough. That's when it finally killed the set-top device intended to take over the world. It only took over a small part of it.

Don't shed any tears for WebTV. In a lot of ways, it's still with us.

(Also: Don't shed any tears for Steve Perlman. He's a serial entrepreneur who developed the also-defunct cloud-gaming service OnLive and an impressive-sounding cellular technology called pCell. He's also at the center of a legal battle around some face-scanning technology that has been used in some of your favorite big-budget action movies. The latter issue is a headache, but he's doing OK, even if WebTV didn't turn him into Steve Jobs 2.0.)

Ultimately, Microsoft found a better vessel for the WebTV idea in its Xbox console. While web browsing was not a prominent part of the Xbox or the Xbox 360 (though Internet Explorer was eventually installed on the 360), the Xbox One has stellar web-browsing capabilities that actually sort of make sense on a 60-inch screen.

Part of the reason for this is that a number of members of the Xbox team were alums of the WebTV service, and their fingerprints were all over the 360 in particular. (It helped that those WebTV alums were also involved in another noble experiment in interactive set-top boxes, the 3DO.)

But a bigger shift may be in the television itself. While fast internet connections are more common today, it's also important to note that nearly every TV sold in stores these days relies on HD technology that's generally good enough to work as a computer monitor in a pinch. Sometimes, the screens are even better than HD.

But ultimately, people don't primarily use the Xbox One to surf the web and check their email, and they don't really use the Apple TV or Roku for those reasons, either. These devices are built for apps—big apps focused on games, small apps focused on video, but, no matter how you shake it, apps. And apps fully optimized for the interface, ultimately, were what the public really wanted from their TV-based online experiences.

The web is nice, but we prefer it within a foot or so of our faces.


We Screwed Up

Today in Tedium: Despite the name of this internet publication, I've never mentioned what might perhaps be the most tedious part of the world's modern existence, the most basic of building blocks that connects every one of our devices, manufacturing processes, and likely even the chair you're sitting in right now: the screw. It strikes me that screws are the ultimate example of objects that hide under our noses, objects we never think about. But I was thinking about them a lot last night when I tried to thread a nut onto a screw and misaligned it so annoyingly that it took a lot of physical might to unscrew it. My question for today's Tedium: Why don't we talk about screws more often, and how can I start that conversation? By the way: No nails in this issue. This is all about screws, guys—screws that you should keep away from air mattresses. — Ernie @ Tedium

69

The number of screws in a full screw kit for the iPhone 6S sold on Amazon. (Yes, I counted. It was tedious.) No word on how many screws the iPhone 7 has, but the iPhone 6 had just 52 screws, according to one online seller. Screws have periodically been a source of controversy for Apple, particularly when the company introduced pentalobe screws with the iPhone 4 at a time when pentalobe screwdrivers were very rare. The screws were seen by repair experts, such as iFixit's Kyle Wiens, as a way to prevent users from repairing their own devices.


Five different varieties of screws that weren't popularized by a guy named Phillips

  1. The square-headed Robertson screw predated the Phillips screw by about 30 years, and for decades it was more common in the U.S. than the Phillips, which eventually won out not due to a more efficient design, but because of licensing drama. See, Henry Ford wanted inventor P.L. Robertson to license out his screw design. Robertson refused, and that led his design to lose out to the Phillips screwhead in the U.S. market. The Robertson screw is still popular in Canada, however.
  2. The hex socket set screw, named for its six-sided hexagon design, isn't named after a person, but its corresponding tool is. The Allen wrench, named for William G. Allen, has existed for more than 100 years. The reason the wrench is named for Allen rather than the socket? Because the hex screw predates the Allen wrench by a few decades.
  3. The Bristo screw, which is now called the Bristol Spline Drive, has an unusual spline-driven circular design that is claimed to be excellent at producing torque. The design, which initially had an Allen wrench-style shape, dates to 1911 and is credited to a guy named Dwight S. Goodwin.
  4. The Torx screwdriver, which first appeared in 1967, introduced a star shape that has become fairly common in certain technical uses, such as cars, bikes, and consumer electronics. Unlike a Phillips screw, it's designed so the driver doesn't slip out, and at first, to prevent people from unscrewing it, the screwdrivers were proprietary. (It's similar to Apple's pentalobe screw, except pointed instead of rounded.) They also came in handy for guns. "Before Torx, which appeared in 1967, all firearms relied on slothead screws, which were designed to make ordinary shooters miserable and enable gunsmiths to drive around in Bentleys," Field & Stream's David E. Petzal wrote.
  5. Perhaps the most interesting and unusual screw design of the past few decades is the Outlaw Fastener, a multi-tier screw that is akin to combining an Allen wrench with a three-layer cake. It was the subject of a successful Kickstarter campaign in 2013. The design, though fairly advanced, isn't totally new; it appears to be the direct descendant of the Uni-Screw, a design that dates back to the 1960s.

"It is well known to persons who use screws that if the nicks are narrow and shallow it is difficult to drive the screw without the screw-driver slipping out of the nicks, and if the nicks are wide and deep to afford a good gripe, the head of the screw is weakened, and the screw-driver is liable to slip out sidewise and deface the finished surface of the work, and if the screw-driver is the same width as or wider than the head of the screw, the countersink work is liable to be defaced, and the angles of the screw-driver are often broken."

— Engineer John Frearson, in his patent application for a screw drive head that used a cross style very similar to the more commonly known Phillips head. The Phillips screw, however, has a slight curvature in the center, which means that when the screw is in all the way, the screwdriver inevitably slips out (or, if you're me, you keep twisting anyway until you've fully stripped the metal and made the screw useless). Frearson screws, a 20th-century innovation based on John Frearson's 19th-century work, are popular with boating types.


(Clint Budd/Flickr)

The U.S. debated screw threads for more than 30 years

The history of the screw is surprisingly diverse and unexpected.

It dates back thousands of years, to ancient Greece, when Archytas of Tarentum is said to have invented an early version of the device. It bears the fingerprints of Leonardo da Vinci, and it was a key part of the Industrial Revolution.

But in the midst of the peak of innovation around the screw—a period that saw the creation of both the Robertson and Phillips screws—the U.S. government … well, it spent a lot of time researching screw threads.

(For people who don't regularly screw stuff in, the thread is the helical ridge that wraps around the screw's shaft. It's what makes a screw a screw and not a peg.)

In 1918, Congress passed a law establishing an organization called the National Screw Thread Commission, with the goal of setting consistent standards for screws. The impetus for this effort, as you might guess given the timing of the law's passage, was military: the armed forces use a lot of screws, and inconsistencies were apparently bad enough after World War I that Congress had to do something about it.

John Q. Tilson, a Connecticut congressman, argued that the measure was necessary due to the problems that a lack of consistent screw threads was creating. He also made the case for businesses, which he argued would also benefit from screw compatibility.

"Private manufacturers, however, desire this done just as much as everybody else," he said, according to The Journal of the Society of Automotive Engineers. "They would like to have the standard of tolerance for screw threads all over the United States.

The law, of course, passed, and we had the National Screw Thread Commission, perhaps the most obscure bureaucratic organization to ever exist.

But it had a perfectly good reason to exist. A 1926 New York Times article about the commission highlighted the 1904 Baltimore fire, in which fire departments from other major cities came in to help. Unfortunately, those cities' hoses used threads incompatible with Baltimore's hydrants, rendering their help useless.

The article noted that the government was working closely with the U.K. on the issue, and the differences between those two countries did a lot to underline the problem:

There is, however, a fundamental difference in the angle of the thread of the two systems. This is 60 degrees for the American and 55 degrees for the British thread. Still another difference is that the American thread has flattened crests and roots, whereas those of the British thread are rounded.


(via Pixabay)

This difference surfaced essentially because the U.K. had created its own standard, the Whitworth thread, and an American thought he could do things better. In 1864, William Sellers introduced a screw design for the American market that borrowed inspiration from Sir Joseph Whitworth's design but pitched a new path forward.

Now, the U.S. was trying to convince the rest of the world that Sellers' design was the way to go. And it took a long time to sell them on the idea. The National Screw Thread Commission was active for three decades, partly because of all the details to go over, and partly because another war threw a wrench in the mix. (We haven't even talked about wrenches!)

What eventually ended the commission's long reign of terror was a deal with Canada and the U.K. to embrace the 60-degree screw thread, something that's called the Unified Thread Standard. (The Second World War, of course, highlighted just how big a problem the different screw standards were.)

At the same time the U.S. was working out its thread standard, the rest of the world was doing its weird metric system thing. Around the same time as the launch of the Unified Thread Standard, the International Organization for Standardization was getting off the ground, and one of the first tasks it took on was standardizing the screw. The good news is that the screw design it chose threads in at the same angle as an American screw. The bad news? It set the design on the metric system, while the U.S. system was based on inches.
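To put numbers on that split: both standards cut their threads at 60 degrees, but a Unified screw is sized in inches and threads per inch, while an ISO screw is sized in millimeters of pitch. Here's a minimal Python sketch (the specific sizes are chosen purely for illustration) comparing a 1/4"-20 UNC screw with the superficially similar metric M6:

```python
# Both systems use a 60-degree thread angle, but the sizing doesn't line up.
# Illustrative comparison of 1/4"-20 UNC vs. the similar-looking M6 x 1.0.
INCH_TO_MM = 25.4

unc_major_dia_mm = 0.25 * INCH_TO_MM   # 1/4-inch major diameter -> 6.35 mm
unc_pitch_mm = INCH_TO_MM / 20         # 20 threads per inch -> 1.27 mm pitch

m6_major_dia_mm = 6.0                  # M6 major diameter
m6_pitch_mm = 1.0                      # M6 coarse pitch

print(f'1/4"-20 UNC: {unc_major_dia_mm:.2f} mm diameter, {unc_pitch_mm:.2f} mm pitch')
print(f'M6 x 1.0:    {m6_major_dia_mm:.2f} mm diameter, {m6_pitch_mm:.2f} mm pitch')
# Close enough to force together, different enough to strip both parts.
```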

So much for a unified system, though Americans don't seem to mind it. (It would be hilarious, though, to see shoppers at Lowe's confused if one day they picked up metric-standard screws rather than the American ones they're used to.)

In 1960, when the U.K. moved over to the metric system, the Americans lost an ally on the screw thread front.

So it's basically us and the Canadians, and the Canadians are using those weird square screwdrivers anyway.

In a lot of ways, the perils of the screw highlight some important modern debates we're having with technology.

Specifically, regarding the 3.5mm headphone jack, that thing Apple is apparently trying to kill.

Let's put the company's argument in screw terms: The headphone jack, as it currently stands, is the musical equivalent of a Phillips screwdriver, better than what we originally had (flathead screws) but also greatly lacking in terms of what could be (no screws whatsoever—they're getting rid of the screws over at Apple).

They'll still give you an option for screwing stuff in for now—they're using pentalobe screws, so you'll have to use an adapter so your Phillips screwdriver still works. But in the long run, you better plan to throw away your screwdrivers altogether, because where they're going, you won't need them.

That argument is not a great one, but it's understandable why they might want to change things—clearly a Torx screw is better than a Phillips, but we've already committed to the standard.

Apple's counter-argument: Screw the standard! Why not give people something better that improves their lives?

And as we know, Apple has never been one to follow standards. They invented a new screw just to piss people off.

The Big Data Jukebox

Today in Tedium: Discovering new kinds of music is easier than ever, something we can credit Napster for—along with its later legal followers, particularly Spotify, Pandora, and Apple Music. (Oh, and eMusic. That site was great back in the day.) But having all this music at our fingertips was only part of the equation—we needed to be armed with information. I think I realized that point during the early 2000s, when I found myself in the music section of a Barnes & Noble store. I was curious about a record by a band called Whiskeytown, but didn't really know much about it. Fortunately, there was a barcode-scanner kiosk that could help. I put the CD under the scanner, and it pulled up a fully-informed review of Ryan Adams' first brush with fame, along with some rad audio samples. I bought the album, and it became one of my favorites. The technology that enabled that kiosk? That's still with us today, and it makes Spotify and Apple Music even better than they already are. Today, we talk about AllMusic, one of the internet's first—and best—archival projects. — Ernie @ Tedium

3M+

The number of album entries in the AllMusic database. The database also has more than 30 million tracks—with much of this data collected over the past 25 years. In comparison, Wikipedia has approximately 5.2 million articles of all types in its English edition, and approximately 41.6 million articles overall. Not every song or album in the AllMusic database has the level of organization or depth of a Wikipedia article, but the database is nonetheless loaded with valuable data.


How AllMusic highlights the value of the "long tail"

Perhaps it wasn't actually built for that purpose, but Wikipedia is often seen as a key source for all things pop culture. AllMusic, which actually was built to classify popular culture, beat Wikipedia to the punch.

The site, which actually started in dead-tree form as a series of books, was active online in the '90s as a Gopher site, and then as a website, all years before Jimmy Wales and Larry Sanger had a somewhat similar idea.

But being a for-profit business, unlike the Wikimedia Foundation, adds some wrinkles to the mix. The All Music Guide, as it was originally called, has gone through a variety of iterations over the years, along with a number of owners.


One of the earliest online iterations of All Music Guide. (via AllMusic's Facebook page)

Initially started in the early '90s by Michael Erlewine, an entrepreneur whose prior claims to fame include giving Iggy Pop his famous name (as he told Wired in 1994) and a computerized astrology business that dates back to 1978, the business has passed through a variety of hands over the years—enough that tracking the full history of AllMusic would probably be complicated without a chart.

To keep things simple, I'll explain the current setup of AllMusic, which has a fairly interesting split:

The actual database platform, including the album reviews that are distributed to a variety of services, is owned by a firm long known as Rovi, though it changed its name to TiVo after a purchase you may have heard about. That database, due to the caliber of clients that license it, is immensely valuable and can be delivered to licensees in a variety of ways.

The AllMusic website, the site the public is likely most familiar with, was spun off by Rovi a few years ago as part of the All Media Network, and now licenses the database content from its former corporate parent. The All Media Network also includes a number of other verticals, including the TV-watching tool SideReel, AllMovie, and the video platform Celebified. (SideReel wasn't formed by AllMusic's original parent firm, but otherwise complements it well.)

Zac Johnson, All Media Network's senior product manager, has been on both sides of the table: For years, he was an editor with AllMusic, and since the spinoff happened, he has played a key role in maintaining the AllMusic.com website.

(Fun fact: That album review that convinced me to get into Whiskeytown? Written by Zac. Weird coincidence, huh?)

Johnson says that, even as the data-licensing business picked up—with clients from throughout the past quarter-century of the music industry, including defunct platforms like CDNow and Borders—the website remained online as a free resource. AllMusic.com, as he puts it, became something of "the world's most elaborate business card," because it highlighted just how elaborate its database was.

"If an online store or streaming service wanted to know how much info they could license from All Media Guide," Johnson explained, "the folks there could say, 'Look at AllMusic.com and you'll see all of the reviews, ratings, biographies, catalog info and anything else you might want to help your customers decide which Miles Davis album to buy or what our editors think about the latest Strokes album.'"

Not that everything can get in—there's just too much stuff, and the editors have to pick and choose their battles as a result.

"The tough answer is that we only have so many folks able to cover so many albums and artists and they have to prioritize accordingly," Johnson adds.

"It's easy to hate Paris Hilton—lord knows that she and her friends like Brandon Davis are walking advertisements against the repeal of the estate tax—but any pop fan who listens to Paris with an open mind will find that it's nothing but fun."

— An excerpt from a 2005 review of Paris, the pop album Paris Hilton released that year. The review, written by frequent AllMusic scribe Stephen Thomas Erlewine, holds something of a controversial place among music fans, who can't believe that Paris Hilton might get a 4 1/2-star review on a platform like AllMusic. But the thing is, the review highlights the fact that the database is not about playing to the hipsters (some of whom, like me, are quietly smarting over some of AllMusic's other reviews) but to the genre. In this case, if you like pop, Paris is a high-water mark. "If you think about the concept of rating an album within the style and within the artist's catalog, our treatment of Paris totally makes sense," Johnson told me. (By the way, Stephen is related to Michael Erlewine—his nephew, to be exact.)


The guts of AllMusic's massive machine

AllMusic may have been one of the most ambitious sites of the early-internet era—and it's one that is fundamental to our understanding of pop culture.

Because, the thing is, it doesn't just track reviews or albums. It tracks styles, genres, and subgenres, along with the tone of the music and the platforms on which the music is sold. It then connects that data together, in a way that can intelligently tell you about an entire type of music, whether a massive genre like classical, or a tiny one like sadcore.

(Looking for a starting point? Johnson recommends the Editor's Choice page. I recommend the "Obscuro" genre page.)

All this information requires a massive database, one that has had to be upgraded to keep up with the times more than once. (These days, the database is powered by a combination of MySQL and MongoDB, in case you're curious.)
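To give a rough sense of what "connecting that data together" looks like in practice, here's a toy relational sketch—using SQLite as a stand-in, with invented table and column names rather than AllMusic's actual schema—where albums link to styles so an entire category can be queried at once:

```python
# A toy sketch of relational music metadata: albums joined to styles,
# so an entire genre can be pulled with one query. Table and column
# names are invented for illustration; this is not AllMusic's schema.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE album (id INTEGER PRIMARY KEY, title TEXT, artist TEXT);
    CREATE TABLE style (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE album_style (album_id INTEGER, style_id INTEGER);
""")
db.execute("INSERT INTO album VALUES (1, 'Strangers Almanac', 'Whiskeytown')")
db.execute("INSERT INTO style VALUES (1, 'Alt-Country')")
db.execute("INSERT INTO album_style VALUES (1, 1)")

# "Tell me about an entire type of music" becomes a simple join:
rows = db.execute("""
    SELECT album.title, album.artist
    FROM album
    JOIN album_style ON album.id = album_style.album_id
    JOIN style ON style.id = album_style.style_id
    WHERE style.name = 'Alt-Country'
""").fetchall()
print(rows)  # [('Strangers Almanac', 'Whiskeytown')]
```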

The sheer scale of the platform means backend changes are no small endeavor—and on the AllMusic.com side of things, developers do a lot of testing to keep things working properly, especially to keep its passionate fans—a small section of whom are ardent advocates for the site's classical section—happy.

"Making sweeping changes to a site that has something like 60 million URLs and over a hundred page types is not undertaken lightly," he says.


Cub Koda, as presented in AllMusic's database. (Guenther Studios)

Five interesting facts about AllMusic and its massive database

  1. The database features content written by a number of professional musicians, most notably Cub Koda, whose claim to fame was fronting the band Brownsville Station and writing their biggest hit, "Smokin' in the Boys Room." (Koda died back in 2000; like his most popular song, the legacy of his words lives on.) Some of these musicians are native to the music scenes in AllMusic's home base of Ann Arbor, Michigan, as well as nearby Detroit.
  2. Yes, AllMusic occasionally changes its reviews, and sometimes its ratings. But its reason for doing so has less to do with a change of heart and more to do with a change in culture. Sometimes, Johnson says, albums may at first receive reviews that poorly capture the album's role in the zeitgeist, requiring a rewrite. Coldplay's Parachutes is a key example of this. (The original review, 1 1/2 stars, was brutal.) Additionally, albums rarely get a five-star rating initially, but only tend to earn it over time when cultural significance has coalesced around an album.
  3. If you're willing to dig enough, you'll find a few Easter eggs in the AllMusic database, some of which help surface the personalities of the authors. Perhaps one of the greatest Easter eggs is this biography of Brian Austin Green, the Beverly Hills, 90210 star and onetime rapper who is still married to Megan Fox. "Beverly Hills 90210 ended production in 2000, and Green has no doubt been totally busy and stuff since," the biography ends. (All the good stuff comes before that ending.)
  4. The corporate-parent split I mentioned above does sometimes create confusing situations for artists and labels that are trying to get albums added to the database, or for folks trying to get something corrected. (To help suss out the confusion, an important bit of advice: Check the FAQ, which goes over these issues in depth.)
  5. The company's offices were home to a very early Taylor Swift performance. She performed in the firm's lunchroom back in 2006, when she was 16. ("I had two strong feelings of the performance at the time," Johnson recalls. "One was that she was so tiny and skinny I feared for her health, and the other was that she seemed so sweet and naive I thought, 'You'll never make it in this industry, kid. They'll eat you alive!' So, clearly I am a terrific judge of someone's talent and ability.")

It's been suggested that writing about music is an idea as absurd as "dancing about architecture."

(By whom, we have no idea—Elvis Costello is generally considered the source for this quote, but he denies having ever said it. The best guess is Martin Mull, a guy best known for making comedy about music.)

In a lot of ways, AllMusic has proven this argument wrong. It's become a key part of the way we understand music online. It's an important resource for journalists, musicologists, and casual music fans alike. If we didn't have it, streaming music would be a lot less useful.

Johnson says that the role that AllMusic.com plays as a platform is somewhere in-between what you might get from Wikipedia (written by the users, often focused on just-the-facts-ma'am) and what you might see from Pitchfork (opinion-driven, narrower in focus).

"Many music sites are becoming more news-oriented while AllMusic has stuck to [its] guns of trying to be both an archival resource about the history of music, and also offering guides to find the new stuff that our users should listen to (just like when the idea started 25 years ago)," Johnson says. "So those other sites are definitely in our space, but I think we each offer something different to a different audience."

A quarter-century ago, the All Music Guide must have seemed like an absurd idea to the outsider, one with an unrealistic scope—think of the sheer number of man-hours that go into listening to all of those songs! But AllMusic persisted because, ultimately, we needed something like it.

Databases are our collective memory—with a lot more finality than a tweet, and more flexibility than a book or encyclopedia. In a hundred years, AllMusic is going to tell the story of music far better than it has any right to be told, with far more depth and nuance than a single Rolling Stone article could ever sum up.

And honestly, that's a pretty great cultural spot to hold.

When It Rains, It Pours

Today in Tedium: Umbrellas are a good way to protect your head in case of inclement weather. Problem is, they’re not a perfect way. And that is made obvious by the fact that you’re sometimes hit by at least a little of the wet stuff even after you use such a device. If it gets windy (or even if it doesn't), their structural integrity breaks down easily, exposing users to rain. They can also be bulky, even unreasonably so. And they’re incredibly easy to forget and lose. For these reasons and others, this scenario is likely a common one: A person, frustrated by the device supposedly covering their head, thinks to themselves, “there has to be a better way,” runs to their garage, and starts working on a rethink of the device. Eventually, they form their idea, call up the big umbrella manufacturers, and go to the patent office, thinking their idea is unique. Turns out, they are far from alone. Today’s Tedium is about the exciting world of umbrella patents, because why the hell not? — Ernie @ Tedium

30+

The number of trademark challenges The Travelers Companies has filed against other companies attempting to create a logo with an umbrella baked in. “We have one of the best and most recognizable brands in the world and take seriously our responsibility to protect its value,” a spokesperson told the Wall Street Journal about its decision-making process, which has led the company to take on even tiny firms that would never hope to compete with the $33 billion company.


(SSG Robert Stewart/U.S. Army/Flickr)

Your umbrella innovation probably isn't as clever as you think it is

Since 1790, the U.S. Patent and Trademark Office has had a mandate to help register the ownership of products to specific people and companies.

For nearly as long, people have been registering patents for umbrellas and all sorts of other things of varying complexity.

And the umbrella has its own designation from the USPTO. According to the government office, an umbrella is generally considered "subject matter comprising an easily-portable canopy type having a cover, a stick, and a framework comprising stick-supported ribs and stretchers for supporting or shaping the cover."

That fairly broad canvas of a design (along with the "subcombinations and appurtenances peculiar to umbrellas" covered by patent law) has been used and abused in thousands of ways over the years, especially in the name of pet owners.

In 2008, New Yorker scribe Susan Orlean highlighted the fact that the patent office received enough umbrella patent filings at the time that four people were on the umbrella patent beat in the federal government. A search of Google patents shows that, since Orlean's article was published in February 2008, 1,617 new umbrella-related filings have gone up in the USPTO database in the "Walking Sticks; Umbrellas; Ladies' or Like Fans" section, which is where umbrellas, fans, or other items on sticks appear.

And all those patent filings don't seem to actually impress the manufacturers of those umbrellas. Orlean explained why:

Totes Isotoner, which is the largest umbrella company in the country, stopped accepting unsolicited proposals several years ago. One of the problems, according to Ann Headley, the director of rain-product development for Totes, is that umbrellas are so ordinary that everyone thinks about them, and, because they’re relatively simple, you don’t need an advanced degree to imagine a way to redesign them, but it’s difficult to come up with an umbrella idea that hasn’t already been done. The three-section folding umbrella, for instance, which seemed so novel when it was first manufactured, in the nineteen eighties, was actually patented almost a hundred years ago.

But those long odds and cynical comments still haven't stopped folks from trying. In 2012, for example, a pair of Taiwanese designers took on the rain with a bold concept design they called the Rain Shield, which handles sideways rain and avoids getting blown out of the way by wind.

(Despite the buzz it received, however, it has yet to hit the market in a meaningful way. Bummer.)

That's a good idea. A lot of other ideas are terrible, of course.


The five most insane umbrella-related patent filings we could find

  1. "An umbrella sheath is usable with the lightning rod to provide shelter from rain and to demarcate an area of lightning protection. An alternative lower electrode is disclosed for physical and electrical contact with water." — A filing for a portable lightning rod that, for some reason, comes with a built-in umbrella. The filing, when approved in 1983, was approved on utility grounds, despite a court stating, "we do not hesitate to say we would not consider using the claimed device for its intended purpose."
  2. "It will be appreciated therefore that there is a need for an umbrella to protect a pet from inclement weather conditions and which umbrella is under control of the individual walking the pet as well as enabling the pet to be under control of the individual via the umbrella and a leash in both umbrella opened and closed positions." — A filing for a combined pet-leash and umbrella, a device that raises the obvious question: What about the owner? Multitasking two umbrellas and a dog leash sounds like a great exercise in coordination.
  3. "This frame is set and riveted on the brim of the hat, and supports the whole mechanism. It may also be fixed below the hat-brim." — A filing for an umbrella hat that dates back to 1882. The innards of this thing had a number of pins that look a heckuva lot more complicated than actually picking up an umbrella.
  4. "The present invention relates to sunning accessories, and more particularly, to pillows with retractable umbrellas." — A 2004 patent filing for a pillow with a retractable umbrella, just in case you were expecting something else.
  5. "Briefly, to achieve the desired objects of the instant invention in accordance with the preferred embodiment thereof, a helium-filled sun shade is provided for protecting individuals engaged in outdoor activities." — A 1991 patent filing for something called a helium-filled sun shade. Sound confusing? Well, another way to put it is that it's essentially an always-on umbrella for the sun.


(Jon Jordan/Flickr)

Diagnosing the problems that keep poor-quality umbrellas with us

You have to wonder why, in an era when we can shove a computer into a watch, we've struggled to improve on this basic design in a way that's truly gone mainstream.

Sure, the basic design of the umbrella is pretty simple, and it's somewhat built to last. And the Totes exec quoted in the Orlean piece has a point about it simply being too common for a new design to break through.

On the other hand, while everything else on our person tends to get an upgrade, whether it's our phone, our shoes, or our personal style, umbrellas persist despite having a design that struggles at the one job it's been tasked to do. Why is that?

If I had to pin it on a single item, it'd be the fact that people don't actually want to pay for umbrellas. According to a 2010 report by Accessories Magazine in conjunction with NPD Group, the average price of an umbrella was just $6, a price point that suggests "disposable commodity" rather than "thing I actually care about." (That said, the report notes that 80 percent of the industry's profits come from umbrellas sold above a $5 price point. So maybe people who really like their umbrellas are willing to spend more?)

That issue is not limited to the present day, either. Umbrellas and Their History, an 1855 book by William Sangster, noted that a lot of innovation was happening in umbrellas at the time, but consumer interest was nonexistent in most of these innovations, as Sangster explains:

Simple as the construction of an Umbrella may appear, the number of patents that have been granted within the last thirty years might have been enormous, and a small book might be written on them, so it is of no use to attempt, in our small space, to more than mention a very few of the various improvements in their manufacture. With very few exceptions the inventors have not been repaid the cost of their patents. This has arisen, partly from the delicacy of their mechanical construction, unfitted for the rough usage to which Umbrellas are exposed; but chiefly in consequence of the increased cost of manufacture not being compensated by the improvements effected.

There are probably a variety of reasons for this incredibly frugal state of affairs—for one thing, wetness is a temporary condition—but I would say a big one is that umbrellas are really easy to lose. In fact, thousands of umbrellas get lost each year in London's public transportation system—10,907 in 2014 alone, according to the BBC.

Sure, people should be better about keeping their stuff, but in a world where we use many of the things in our bags for multiple reasons, a bulky single-use device is going to be the first to lose out.

Charles Lim, the author of the site Crooked Pixels, suggests that it's this disposability that makes it unnecessary to redesign umbrellas at all. In fact, he suggests (while responding to yet another umbrella redesign) that a better strategy would be to buy the cheapest, tiniest umbrellas you could find—say, from a dollar store—and carry one on you at all times.

"So what do you do with an imperfect dollar store umbrella? You throw it out," he wrote on his blog. "It’s disposable, like a condom. Most umbrellas sell for 10-20 dollars. A small Dollarama umbrella is two dollars. At that price, you can buy more than one because hey, you never know."

Maybe all those weekend designers out there are solving the wrong problem. Maybe umbrellas need to be made of the cheapest, smallest, most recyclable materials one can find. And maybe we should just admit to ourselves that we're going to ditch them after a short period of use.

Maybe our inventors haven't caught up with where the market actually is.

$344k

The amount that a UK-based Kickstarter called the KAZbrella raised last year. The company clearly was onto something, because copycats soon appeared—likely because the umbrella pulled off a neat trick: it figured out a way to trap the moisture inside, rather than leaving it outside, while still looking exactly like an umbrella you'd see on the street during regular usage. When the umbrella goes on regular sale, it won't be cheap, though: it'll sell for a not-insignificant $58 a pop.

The other day, as I was leaving home, I forgot my umbrella. It wasn't the first time I've done that, nor will it be the last. In fact, I'll probably do it more often than not.

It was raining during the day, sure, but it was doing so in such a way that there were pockets of dryness. I ended up sticking around where I was, waiting for one of those pockets to show up, at which point I skedaddled as fast as I could to avoid another soaking.

If there's something I could redesign about the umbrella, I would make it possible to only be there when I need it, then to go away when I don't. If we figure out how to teleport stuff one of these days, the first thing we need to teleport is umbrellas into the hands of people walking out into rainstorms.

It's a single-use device in a many-use world. But without it, we're soaked.

Adapters, Unplugged

Today in Tedium: Power outlets are frustrating, but the devices we use on a daily basis? They don't necessarily make things easier. In fact, the bricks that come with many modern devices, from your router to your Bluetooth speakers to your game console, tend to be giant, and at first, they were designed without any consideration for the fact that there was ample competition for your limited outlet space. Why have AC adapters traditionally been so frustrating? Today's Tedium explains. — Ernie @ Tedium

1972

The year the power strip was invented. It's not an American invention, but an Australian one. The idea was formulated by Kambrook founder Frank Bannigan and his understudy, Peter Talbot. (The initial design looks closer to a boombox than the array of outlets we're used to.) Despite inventing something that soon became a part of households all over the world, Kambrook never patented the device. "I've probably lost millions of dollars in royalties alone," Bannigan said of the oversight back in 2000. "Whenever I go into a department store and see a wide range of power boards on offer, it comes back to haunt me."


(Toshiyuki IMAI/Flickr)

Why "wall warts" put their pockmarks on good outlets everywhere

What's the difference between a cheap product and one that's been heavily designed?

In some cases, it may be the wall wart, or the power adapter that transforms energy while directly attached to an outlet.

Here's why they exist: Most modern electricity systems rely on alternating current, or the type of energy Nikola Tesla made his name on in the late 19th century, besting Thomas Edison in a well-remembered feud over how Americans should power their lightbulbs.

However, most of our electronic devices internally use direct current, which is more efficient for low-power devices and offers a constant voltage level. If the current alternated on a circuit board, it would be far less effective at delivering messages from the CPU to your screen, just as an example.

The AC adapter is effectively the bridge between how your devices actually use energy and how the power system delivers that power. The transformer baked inside turns the high voltage in the grid's power stream into something manageable for your Casiotone.
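As a rough illustration of what the brick is doing—a minimal sketch with an invented transformer ratio and typical diode drop, not any real adapter's spec—the transformer steps the mains voltage down, a diode bridge rectifies it, and the result lands in low-voltage DC territory:

```python
# Idealized "wall wart" math: step mains AC down, rectify, estimate the DC out.
# The turns ratio and diode drop here are illustrative assumptions.
import math

MAINS_RMS_V = 120.0    # U.S. mains voltage (RMS)
TURNS_RATIO = 13.0     # hypothetical step-down transformer ratio
DIODE_DROP_V = 0.7     # typical drop per silicon diode

def estimated_dc_output(mains_rms: float) -> float:
    secondary_rms = mains_rms / TURNS_RATIO    # transformer steps voltage down
    peak = secondary_rms * math.sqrt(2)        # peak of the stepped-down sine
    return peak - 2 * DIODE_DROP_V             # full bridge: two diodes conduct

print(f"~{estimated_dc_output(MAINS_RMS_V):.1f} V DC")  # roughly 11.7 V
```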

Now, you might be wondering to yourself: Why isn't this functionality included directly in our devices? Why can I plug in a TV without an extra adapter, but my router needs a wall wart? Simply put, it comes down to a few things, including the cost of the device, the efficiency of the design, and what's most likely to break. An AC adapter removes a point of failure from the product itself.

But these devices come with a lot of complications. The biggest, perhaps, is the fact that the bricks aren't horribly efficient at their jobs, which is why they tend to overheat. But the other problem is their size—the earliest designers of AC adapters were attempting to solve an engineering problem, essentially, without much thought to how the design of the brick would interact with every other device that needs to be plugged in.

Another problem is that different devices tend to have different voltage needs, which means that many of these adapters aren't compatible with one another.

Some of this is understandable—different products require different types of voltages, and that might make a closed loop desirable for some designers—but it creates a lot of waste.

As a result of these problems, AC adapters generate a lot of hate in some circles.

"AC-plug power supplies are a cheap workaround to various engineering, economic, and regulatory problems that manufacturers face, and they solve those problems by pushing them off onto end users," wrote OS News contributor David Adams.

It's a bad design that sets a bad precedent, but it likely stuck around because, when it comes down to it, it's cheap and solves a problem.

And that solution lets electronics manufacturers focus on the parts of the products they can do justice.

"Accessory manufacturers have accorded serious thought to the problem of converting battery operated sets over to A.C. operation and solved it by the development of an A.C. adaptor harness whereby practically any standard five or six tube D.C. filament radio receiver can be converted into an A.C. filament operated installation, without molesting any of the wiring within the receiver cabinet."

— An article in the February 1928 edition of Radio Engineering, offering one of the earliest mentions of an AC adapter—in this case, one designed to let a battery-operated radio run on the power grid. The device, advertised by the H.H. Eby Manufacturing Company in the previous month's issue of the same magazine, was intended specifically to allow AC-based vacuum tubes to be used in DC-based radios.


How Sega created an outlet conundrum for its biggest fans

If you owned a Sega Genesis back in the mid-'90s, you were most likely aware of the fact that, unlike your friends who owned Super NESes, you had an upgrade path.

And it wasn't just limited to a Game Genie, either. Sega was more willing than its competitors to release add-ons that expanded what could be done with its system. Most notably, there was the Sega CD, but the company also released the 32X, a stopgap solution that turned the Genesis into a 32-bit machine, at the end of 1994.

But these three devices together created a conundrum for gamers. See, each one of these products had its own AC adapter, and you couldn't use the three devices together unless they were all plugged in. They didn't share a power supply.

This was a problem for the average power strip. Power strips at the time were designed to be very narrow, with their outlets aligned vertically. If you have a single device with an AC adapter on a power strip, this isn't a problem, because it can be attached to the end of the strip. But if you have three, you suddenly have a traffic problem. Simply put, you can't plug in a TV and three AC adapters on a single power strip. You'll be lucky to even fit the three power adapters.


(via Sega Retro)

Sega immediately realized the problem it created for gamers who really wanted to play Fahrenheit. Its response? With the release of the 32X, it also started selling an officially-branded power strip, one that flipped the outlets so the AC adapters could be plugged in side-by-side.

The 32X was a historic flop for Sega, one that was pushed off the market within about a year and a half of release (the timing of the add-on was particularly bad, because the firm had already announced its forthcoming Saturn console), but since then, Sega enthusiasts have figured out a few interesting tricks to solve this unusual state of affairs. In 2013, one enterprising gamer discovered that, if he used an aftermarket video cable with an additional pin attached, he could effectively run a 32X and Genesis on a single AC adapter. (You have to wonder what sort of technical reason/fire hazard was at play that led Sega to not do that in the first place.)

And some folks, such as this British user, have taken to retrofitting laptop chargers to replace the AC adapters.

Perhaps the most interesting, however, is a homebrew product specifically targeted at Genesis die-hards. For the past few years, a firm called Retro Game Cave has been selling a device called the Sega Trio—an AC adapter with three plugs, allowing you to plug in all three devices at once. The adapter, by the way, is far smaller than Sega's own power adapter, and reportedly generates less heat. But since it's just one tiny firm producing them, the adapters frequently sell out, likely thanks to fawning YouTube reviews like this one.

(The firm also sells an adapter that can power an NES and Super NES on a single device.)

"One of the ways that our Nation wastes energy is through what they call vampire devices. These would be battery chargers, cell phone chargers, computer systems that we—we really think we're not using energy when plugged in but, in fact, are."

— Former President George W. Bush, announcing a 2001 executive order that required the federal government to use devices that draw low wattage when idling—while simultaneously associating electronic devices with vampires. AC adapters are notable examples of this problem, as they tend to draw power even when not in use. And often, people will leave the adapters plugged into the outlet even when they're not attached to anything else. "It's as if you decided to pour yourself some orange juice, filled up the glass, and then just left the jug lying on its side, the top off, contents spilling everywhere," suggested Grist's mononymic advice columnist, Umbra. "Why, people, why?"
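The arithmetic behind the vampire metaphor is simple enough to sketch. Assuming a hypothetical 1-watt idle draw per adapter and roughly $0.13 per kilowatt-hour (both figures are illustrative assumptions, not from the order):

```python
# Back-of-the-envelope vampire-draw math. The idle wattage and electricity
# price are assumed round numbers for illustration.
IDLE_WATTS = 1.0            # assumed draw of one adapter doing nothing
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.13        # assumed average price, dollars per kWh

kwh_per_year = IDLE_WATTS * HOURS_PER_YEAR / 1000.0  # Wh -> kWh
dollars = kwh_per_year * PRICE_PER_KWH
print(f"{kwh_per_year:.2f} kWh/year, about ${dollars:.2f} per idle adapter")
# ~8.76 kWh and ~$1.14 per adapter -- small alone, real at household scale.
```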

These days, you're much less likely to run into a Sega adapter situation. After the Genesis, the company started building power supplies into its consoles, which nonetheless failed to win over the market. (Sony followed suit with this design strategy; even modern PlayStation consoles have integrated power supplies.)

We haven't gotten rid of power bricks overall—see the Xbox One's ginormous block—but game manufacturers have at least designed them in a way that they no longer get in the way of actually plugging them into a power strip.

And, outside of the world of video games, cheap electronics are all over the place with their AC adapters, meaning that odds are good that we may never be rid of them. I see three solutions to this problem:

Make AC adapters universal in design. Back in 2012, the United Nations attempted to create a universal power adapter standard that would be pushed on just the kinds of devices that use AC adapters—like cordless telephones and set-top boxes. If we can make AC adapters easy to plug in to a couple hundred devices, we might just save people some money on adapters.

Make more stuff USB-compatible. USB has the potential to do more than solve the global power-outlet problem. For lower-power devices, it can make power outlets a thing of the past. Its success as a phone-charging platform, partly driven by the UN a few years back, also helps sell the UN's case for universal AC adapters.

Switch the grid to direct current someday. In recent years, there's been chatter in the energy world about moving toward a DC-based power grid, a move that sounds a bit pie-in-the-sky but nonetheless has some momentum, thanks to a group called the EMerge Alliance that's been making the case that DC distribution could be more efficient. Additionally, certain users, like data centers, might find going all-DC worth their time.

That last idea might sound a bit crazy, but if you own a car, you likely already use DC power outlets on the regular. They're called "cigarette lighter receptacles," and they feed your devices a nominal 12 volts of direct current.

And inverters can convert that DC output back to AC power, too. They certainly can't be any worse than what Sega did to our power strips back in the day.

Make Every Piece Count

Today in Tedium: You may not like meat, but even if you do, you may not think of it in terms of innovation. It sounds like a concept incompatible with the idea of meat—we do so many things with cows, chicken, fish, and pigs that it's difficult to consider these animals in terms of reinvention. We've already mashed these four types of animals into combinations more creative than Legos, taken their parts and turned them into products that don't even fit into the same genre as dinner. But there is, in fact, still innovation to be had at the dinner table. Today's Tedium talks about meat science, checkoff programs, and innovative cuts of beef. Yes, there is innovation in the beef-cutting space. — Ernie @ Tedium

$1

The amount of money, per head of cattle, that goes to the American beef checkoff program. (Organic beef, however, is generally exempt from this per-head rule.) The money began to be collected thanks to the Beef Promotion and Research Act, a 1985 law that requires producers to pay into a fund the beef industry uses to promote itself, as well as to invest in research and development. Essentially, it's a tax on beef production, but one that goes directly back to the beef industry. Most other kinds of animal-based products, including pork, milk, and eggs, have similar checkoff programs.

Make Every Piece Count

(jeffreyww/Flickr)

The story of flat iron steak, which was discovered by a couple of scientists

The beef checkoff program, which you're probably most familiar with through the "Beef: It's What's for Dinner" commercials that have run periodically since the '90s, doesn't just exist to tell you how awesome beef is.

It also has an important role in improving the value of beef as an investment. Let me explain: Basically, a cow has a specific value, and that value is set by the market. (Don't believe me? Check the market data.)

If too many cows are on the market, the value of a cow drops. If too few, there's a shortage, and the price of a steak at the supermarket might be higher than normal. And some types of cows are more expensive than others. You know, basic economics.

At the same time, some parts of the cow are more valuable than others. Tenderloin, for example, tends to be more expensive than chuck. Certain sections of the cow get a prime spot in the meat section; other parts get ground up and spread about every other part of the grocery store.

But there's always room to make a cow more valuable, and that's where the work of Chris Calkins and Dwain Johnson comes into play. Calkins is a meat scientist at the University of Nebraska, while Johnson plays a similar role at the University of Florida’s Institute of Food and Agricultural Sciences.

Together, these men had a great idea: Research the muscle structure of your average cow, focusing on the cheaper cuts, and see if there might be some hidden value in there.

The duo, with the help of the National Cattlemen's Beef Association, researched new cuts of meat from parts of the animal that would have otherwise been turned into hamburger. In 2000, in the largest study of its kind, the scientists tested 5,600 muscle samples for flavor and tenderness. They found 39 muscles worth their time.

Sounds like a weirdly specific line of work. Guess who helped fund that research? That's right, the beef checkoff program, which quickly saw dividends from this work.

See, in 2002, these guys hit the jackpot: They uncovered an impressive new cut called the flat iron steak, an inch-thick piece of meat from the shoulder area of the cow, a place more traditionally associated with the cheaper chuck beef that gets ground into burger.

Essentially, the researchers discovered that if the connective tissue of the cow was trimmed off in a certain way (as shown in this clip), the cut of meat left behind was flavorful and punched above its weight class compared to more costly types of steak.

“Supposedly named because it looks like an old-fashioned metal flat iron, the flat iron steak is uniform in thickness and rectangular in shape,” Johnson said in a 2007 press release about the research. “The only variation is the cut into the middle where the connective tissue has been removed.”

That cut quickly became hugely popular with the public, thanks to the fact that it was cheap, but still very high in quality. High-end restaurants that would have skipped chuck in building their steaks put the flat iron steak on their menus.

A report from Beef U, an education offering provided to the food service industry via the beef checkoff program, highlights the benefits of this kind of research throughout the system:

Turning the underutilized chuck and round into these new cuts means more profitability and higher margins for foodservice operators. In addition, while these cuts have a significant impact year-round, foodservice operators can leverage key benefits during certain times of year. For instance, when both demand and price for steaks increase during summer months, operators can feature more steak options, increasing traffic and margins.

By 2012, according to the Meat Institute, the flat iron steak was responsible for around $80 million in sales.

Clearly, there's something to be said for meat science—or at least for clever naming strategies.

“The new names will help change the way consumers and retailers talk about pork. But more importantly, the simpler names will help clear up confusion that consumers currently experience at the meat case, helping to move more pork in the long-term.”

— Conley Nelson, the president of the National Pork Board, discussing the reasoning behind changing the names of a number of different cuts of pork, something the board did in 2013. The board teamed with the National Cattlemen's Beef Association and figured out that the names of different cuts of meat were confusing to consumers. As a result, the pork board decided to rename its cuts in a style closer to beef: A loin chop is now a "Pork Porterhouse Chop," and a rib chop is a "Pork Ribeye Chop."

Make Every Piece Count

Amilton de Mello, the inventor of the bonanza cut. (University of Nevada-Reno)

The latest cut of beef? Well, it's a bonanza

Since the flat iron steak took hold, the meat industry has been on the lookout for new cuts of meat that could prove just as buzzy for the public.

In recent years, more cuts of meat have tried to follow the flat iron's carefully cut path from the butcher to the grocery aisle.

Among the more notable ones:

Denver Cut Steak: This variety, found in the same bout of research as the flat iron steak, was first introduced to the public in 2009, and also comes from the chuck area. The steak, despite being seen as intensely flavorful, has struggled to have the impact of its more famous cousin, according to Denver's alt-weekly, Westword, which wrote a lengthy feature last year about the steak cut named after its city.

Vegas Strip Steak: Business consultant Tony Mata (who worked closely with researchers on the flat iron steak project) worked with researchers at Oklahoma State University to come up with yet another kind of meat from the chuck area—one that requires a non-standard butchering strategy. But what's most surprising about the cut is how it's being sold: Mata and the researchers patented the cut and are trying to license it out.

And these new kinds of cuts, controversial as they may or may not be, are still being invented to this day. Last month, in fact, the University of Nevada, Reno revealed that one of its on-campus meat scientists, Amilton de Mello, found yet another new cut near the chuck area—one he's calling the "Bonanza Cut."

De Mello, who came to the university last December, says that the cut is actually near where the flat iron steak is.

"When you separate the chuck and the ribs, the Flat Iron steak goes one way—with the Chuck—and the relatively small end stays with the rib side; this is the Bonanza Cut," de Mello told Nevada Today, a publication of the university.

A big part of the reason de Mello spotted it was sheer perceptiveness. According to the Las Vegas Review-Journal, he spotted the piece of meat while working at a beef-processing plant, thought it looked well-marbled, and decided to test it out.

The high-quality cut of beef is tiny—the two four-ounce pieces in each cow may struggle to fill you up on their own—but the high fat content (estimated to be 13 percent) gives the cut a richness that, if sold properly, could prove extremely valuable for the beef industry.
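
The value-upgrade math is easy to rough out, too. A quick sketch; the herd size and per-pound prices below are illustrative assumptions of mine, not industry figures:

```typescript
// Rough annual value of selling the Bonanza Cut at steak prices
// instead of trim prices. All inputs are illustrative assumptions.
const poundsPerHead = 0.5;      // two 4-ounce pieces per animal
const headPerYear = 30_000_000; // assumed annual U.S. cattle slaughter
const trimPricePerLb = 2.0;     // assumed ground-beef/trim value, $/lb
const steakPricePerLb = 6.0;    // assumed steak-quality value, $/lb

const addedValue =
  poundsPerHead * headPerYear * (steakPricePerLb - trimPricePerLb);

console.log(`about $${(addedValue / 1e6).toFixed(0)} million a year`);
// → about $60 million a year
```

Even half a pound of upgraded meat per animal adds up quickly at industrial scale, which is exactly the argument Calkins makes below.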

And the cut has a fan who knows a thing or two about discovering new cuts of meat: Chris Calkins, the University of Nebraska professor who helped discover the flat iron steak. He made it to the introduction party in Reno.

"Upgrading this meat from a ground beef/trim price to steak-quality price should return more dollars to the industry," Calkins said, according to Nevada Today. "I anticipate a positive reception for the Bonanza Cut, especially from countries that recognize U.S. beef for its quality and flavor."

Spoken like a guy who knows his steaks.

“I don’t think it’s unconventional as much it is expanding on convention. Diners have a growing interest in sustainable eating and the nose-to-tail cooking movement has opened minds to new options.”

— Michael Stillman, the owner and founder of Fourth Wall Restaurants, explaining to Zagat why unusual cuts of meat are suddenly in style at fine-dining restaurants, like his Quality Eats restaurant in Manhattan. Unusual cuts have a significant advantage for restaurants, which can use them to bring higher-quality meat (think Kobe beef) down to a price range approachable for more types of consumers.

When reading about these newly discovered cuts of beef, the cynic in me wonders: Are we being fed a line of bullshit?

(The smart aleck in me helpfully points out that we're not, as that's a different part of the animal.)

These animals have been feeding humans for ages. Are we learning new ways to produce and eat them, or are we just getting better at marketing? Or is it both?

In a lot of ways, animals in this raw-meat form are like putty, with the value of their many parts arbitrary. We only think filet mignon is the best cut of meat because that's what we've been told as a society; we've decided it's worth our money for reasons that are less scientific and more a matter of preference. The new-cut strategy exploits those preferences and tries to find them in as many places as possible.

On the other hand, the reason why researchers have discovered these new cuts of meat is still worth pondering. It's evidence that, if we look hard enough at something and consider moving past the limitations we've created for that product, there's a chance we'll find something new, something that adds value.

You may not care about the difference between a Porterhouse steak and a Vegas strip. Despite the fact I wrote this, I certainly don't. But if there's an issue in your way, a frustration you're trying to get past, it always helps to have a fresh set of eyes, a willingness to try a new tactic, and a little bit of ambition to move forward even if people think your idea is a little crazy.

I know, that's a surprisingly deep point to take from a bunch of scientists cutting meat in clever ways, but hey—it's a meaty subject.

Getting The Next Word In

Today in Tedium: Word processors are generally kind of boring—they do their job, and that's about it. Most people don't put a ton of second thought into their word processors. As you might have noticed from my rant from a couple of months ago, I'm not that kind of person. That rant drew up a lot of conversation, most of it good and interesting. (It did have its critics, but even their comments were interesting!) But it hit me that, in a lot of ways, I was spending more time complaining than discussing ways to move text editing forward. Today's issue of Tedium attempts to look at the future of word processing. — Ernie @ Tedium

"It's as though people cannot see text, they see only through text. Some people want to talk about knowledge, some people want to talk about collaboration … very few want to talk about text."

— Frode Hegland, the founder and an organizer of The Future of Text Symposium, an annual event that's designed to highlight discussions around better tying computing to the written word, discussing the way that text gets short shrift in revamps of the word processor. "The analogy that really stayed with me is, it's as though people want to talk about making a better window, but they don't talk about the view," he told me of the most recent event, held in August at Google's Mountain View, California headquarters.

Getting The Next Word In

(James F. Clay/Flickr)

Embracing text for the sake of text

The thoughts that our brain generates come out best when they work together consistently, in a single waveform.

The problem is, when you write stuff in a word processor, you're constantly pulled away from your ideas—you keep having to refer to your research, to look stuff up, and, on a computer, to switch between windows.

This creates a lot of friction and, mentally, you're stuck trying to sort through four or five different layers of messy thoughts. Things get chopped into pieces, and connections fail to get made.

I do my best to avoid that kind of friction, but it's tough. So it was nice to find a kindred spirit on this issue in the form of Hegland, who has long been obsessed with the idea of "liquid information," or how words and ideas can flow together in one fluid swoop.

The British developer and academic, who has been focused on big-picture issues regarding information for decades, has developed an app specifically designed to solve this problem, a macOS add-on called Liquid | Flow that allows users to select a piece of text and, with a few key commands, quickly perform an action on it without even thinking. (I've tried it; it's smooth like butter.)

Hegland's annual The Future of Text Symposium—which this year covered both interesting approaches to word processing like Jesse Grosjean's ultra-minimalist WriteRoom and outré concepts like Robert Scoble pondering text in the era of the Magic Leap—has a nice throw-out-all-the-cards kind of feel.

There were a couple of living legends in the room that day, including Vint Cerf, a co-inventor of the internet, and Ted Nelson, who came up with the word "hypertext." But in many ways, the direction of Hegland's career and his focus on the written word has been defined by a mentor who wasn't even at the August event: Douglas Engelbart, the early tech innovator whose "Mother of all Demos" helped shape the functionality of the desktop computer. (Engelbart died in 2013.)

Hegland said that his focus on the written word has in part been defined by his personal relationship with Engelbart, and he's trying to bring some of Engelbart's thinking to his word processor project Liquid | Author. The app, designed for both macOS and iOS and publicly released for the latter, has an interestingly narrow focus—rather than trying to revamp word processing all at once, it attempts to tackle how teachers and students interact with documents, an interaction that many people are familiar with.

"It should be instant for students to create a citation—web, book, article, whatever—and it should be equally instant for the teacher to check that citation," Hegland explained of the app's strategy.

Another goal with Liquid | Author, he says, is to use his Future of Text gatherings almost as a skunkworks for new ideas that can be implemented in the app, which he'll likely focus on as he works on his Ph.D.

It certainly won't be a hugely commercial project—something Hegland is quick to admit—but it will be a fascinating one to watch.

"As the legend goes, when Steve Jobs 'borrowed' the ideas behind personal computing from Xerox PARC, many subtle, but crucial, differences were lost."

— A page on the website for the document editing tool Notion, explaining in part why the company built the tool, which combines word processing and document collaboration. The company talks at length on its website about trying to "bring back some of the ideas of those early pioneers." That isn't really an argument being made by a lot of companies making word processors these days. (FWIW, Notion is probably the most forward-thinking of the up-and-comers out there, a list that includes Dropbox Paper and Gingko.)

Getting The Next Word In

A modern take on the word processor that isn't really a word processor at all

Maybe the issue with our word processors isn't that there's too much crap. It's just that neither you nor your company cares about anyone else's.

The ambitious, GitHub-famous text editing tool Quill is designed to consider the process of text editing from this perspective. Rather than treating word processing as a single monolithic beast, it's an open-source project—hack it however you need it, thanks. Author and lead maintainer Jason Chen, who helped start the project while at Salesforce, suggests the strategy is intended to allow companies and end users that rely on the platform to create something that more specifically suits their needs.

"It was a common saying that 95 percent of features in Word are never used, but the problem is that different groups of people use a different 5 percent," Chen explained in an interview. "I think these big companies are looking for a tool that just does the 5 percent that they care about and skip the rest."

So far, the approach—which he says is inspired by both Medium's graceful, opinionated design and Etherpad's strong, hackable foundation—has won some prominent fans, such as Vox Media, Gannett, and HubSpot. And the hits keep coming. Last month, LinkedIn revamped its Medium-competing publishing platform to use Quill, with the engineering manager of the publishing platform praising the tool's technical power and flexibility.

"As LinkedIn publishing evolves, Quill’s underlying technology opens the door for rich features, such as collaborative editing and custom rich media types," the company's Jake Dejno wrote.

The strategy, Chen says, reflects a reframing of the place of word processing in the modern day. He argues that word processors generally haven't kept up with the interactivity of the internet (stuff like embeds, you know the deal); simultaneously, though, we have what Chen characterized as a "diminishing need for their strengths."

"The reasons we have traditionally used word processors has slowly been eroded away," he explained. "LinkedIn is replacing the resume, Github is replacing documentation, and blogging (and respective tools) have chipped into journalism. Even documents that are meant to be printed are largely being standardized and automated. Most letters in your physical mailbox today are probably from some bank that generated and printed it without touching Word."

By putting the rendering of words into the hands of the end user—and doing so, it should be said, with state-of-the-art technical standards—projects like Quill could redefine our relationship with the word processor … by, in some ways, removing it from the equation entirely.

So, where does the future stand for word processing in the long run? Are we still going to be using an all-in-one tool for handling the written word down the line? When I posed that question to Chen, he suggested that in the shorter run, companies will begin to use more specialized tools, but that further down the line, we may see significant changes in the way we interact with written words.

"In 15 years I think the input interface and the idea of a 'document' to be entirely reimagined," Chen suggests. "I don't think we'll be at telepathy in that timeframe but I can see a lot more speech and audio playing a larger role for input. Video is already augmenting the document, but VR could be ubiquitous by then."

But thinking of audio and video as a replacement for a written word, while an interesting idea, gives me pause—for a few reasons. One issue comes up as I click through the recording of my Skype conversation with Frode Hegland: I find the sound of my own voice terrible, because I tend not to speak as cleanly as I write. (Man, how do people deal with the fact that I'm a 35-year-old guy who repeatedly says "like" and "you know" in the middle of a conversation? No wonder this isn't a podcast.)

Hegland, who is much more eloquent than I am, suggested that natural language processing was potentially going to change the way that we write—particularly real-time summarization techniques that could tell you whether your story makes sense on the fly. He specifically pointed to the work of Bruce Horn, an early Macintosh developer who now works for Intel, as well as Stanford linguistics professor Livia Polanyi.

But, ultimately, Hegland still saw a place for the written word, as well as the tools that traditionally put those words there.

"Over the past few years I've come to appreciate that freedom of [mental] movement is the key," he said, highlighting the nature of liquidity in putting thoughts to the page. "When you look about the freedom of your own hands moving, you have such incredible freedom of movement."'

(Hegland isn't a Markdown fan like myself, but hey, pobody's nerfect.)

As long as the freedom of mental movement in my hands goes faster than my own voice, I'm probably going to have a need for a keyboard. But what about the next generation? Will word processors eventually get thrown out into the annals of history like a stream of old Motorola RAZRs and BlackBerry phones?

God, I hope not. I still have a lot of words to write.

No Acquiring This Taste

Today in Tedium: We love things that taste good. We hate things that taste bad. Perhaps it's for that reason we don't spend a lot of time talking about terrible-tasting things. But there's plenty of reason that we should. The biggest is this: Sometimes, hiding in that terrible taste might be something so important that it can change the world in a noticeable way. Today's Tedium is about a really terrible flavor that's hiding all over your home. Why haven't you tasted it yet? Well, let's just say that, if the flavor does its job right, you never will. — Ernie @ Tedium

"Celebrity chef Anthony Bourdain famously called 'hákarl' the worst thing he had ever eaten. This may have been coloured by an overall miserable visit to Iceland or by the fact that Anthony Bourdain is a huge sissy."

— Ragnar Egilsson, the former food editor for the Reykjavik Grapevine, calling out our foremost American reality-show food critic for his inability to appreciate hákarl, a variety of fermented shark that's considered a delicacy in Iceland and the bane of reality-show chefs around the world. (At least Bourdain kept it down. Gordon Ramsay didn't.) For what it's worth, real-life Icelander Egilsson claims the bizarre foodstuff smells ("like leprosy") worse than it tastes ("a tangy cheese").

No Acquiring This Taste

(Anathea Utley/Flickr)

There's a chemical that's being used to discourage kids from eating their toys

In 2007, a toy manufacturer had a bit of a disaster on its hands when it was discovered that at least some of the beaded toys it produced were coated with a chemical so dangerous that, when metabolized, it turned into the "date-rape" drug GHB.

Beads are small, and they're easy to swallow. So you can imagine what happened.

The manufacturer of Aqua Dots had a huge problem on its hands after learning this fact, and the U.S. Consumer Product Safety Commission quickly recalled more than 4.2 million toys in an effort to contain the damage.

But eventually, these toys came back on the market in a different form, under a different name. And when they did, they were covered with another kind of chemical—this one designed to prevent kids from eating the beads.

That chemical, denatonium benzoate, goes by the brand name Bitrex, and it's been around since the 1950s. It's currently used in substances as diverse as antifreeze, perfumes, household cleaners, and pesticides. Only recently has the chemical come to the world of toys. But it packs a hell of a punch—a single molecule of Bitrex can make a million molecules of water taste horrible.

If that level of bitterness sounds like fodder for a series of YouTube-style challenges, YouTube is already way ahead of you. Here's a G-rated taste-test from a radio-station morning show crew that was put up to it by a nonprofit organization; here's an R-rated test from a guy who runs a YouTube channel dedicated to eating weird things.

But as we learned in the case of Aqua Dots, this material has some important uses.

"This compound is several magnitudes more bitter, and the bitter taste persists in the mouth for a considerable time. Rice which is contaminated with this chemical in amounts of 0.10 pound per ton is inedible. The bitter taste was so nauseating that no one who tasted the boiled rice was able to consume as much as a teaspoonful."

— A 1968 patent filing from the U.S. Army describing "compositions and method for degrading foodstuffs," specifically highlighting how effective Bitrex was at making food inedible. So yes, the U.S. Army, according to this patent, sees Bitrex as a useful form of chemical warfare that could ruin the enemy's food. While other chemicals were described, the characterization of Bitrex was particularly impressive.

No Acquiring This Taste

(Mike Mozart/Flickr)

The woman who convinced household cleaner manufacturers to make poison taste bad

Bitrex's incredible potency proved handy for a problem that arose in the early '80s, when reports of children being hospitalized for accidentally ingesting household chemicals became commonplace.

The logic is simple. If you make dangerous chemicals taste bad, kids won't eat or drink them.

As New Scientist explained in a 1985 article:

The sensible answer, then, is to make these household chemicals taste so repellant to a child that its immediate reaction if it puts some in its mouth is to spit it out. What is required is a compound so vile in taste that it cannot be tolerated. There are, in fact, several such substances, both natural and man-made, but one that stands out above all others is denatonium benzoate, or Bitrex, as it is commonly known. This white, non-toxic powder, which is soluble in both aqueous and organic solvents, is listed in the Guinness Book of Records as the bitterest substance known. Adding just one teaspoon of powder to a tankerful of water would make the water undrinkable.
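
That teaspoon-in-a-tanker claim sounds like hyperbole, but the numbers roughly hold up. A quick sanity check; the tanker capacity and taste threshold are ballpark assumptions of mine, not New Scientist's figures:

```typescript
// Does one teaspoon of Bitrex really ruin a tanker of water?
// All inputs are ballpark assumptions for illustration.
const teaspoonGrams = 5;                 // ~5 g of denatonium benzoate powder
const tankerLiters = 30_000;             // a typical road tanker
const tankerGrams = tankerLiters * 1000; // 1 L of water ≈ 1,000 g

const concentrationPpm = (teaspoonGrams / tankerGrams) * 1_000_000;
const detectablePpm = 0.05; // commonly cited detection threshold

console.log(`${concentrationPpm.toFixed(2)} ppm`); // → 0.17 ppm
console.log(concentrationPpm > detectablePpm);     // → true
```

At roughly three times the commonly cited detection threshold, the whole tanker would indeed taste noticeably off.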

Problem was, it wasn't a given that household cleaner manufacturers would use this substance, particularly in the U.S.

That's where the hard work of an Albany, Oregon, woman named Lynn Tylczak came into play. Tylczak heard about Bitrex getting used in cleaners made in Europe, but found out that the issue was getting ignored in the U.S.

In the 1980s, well before the days of email, she started a letter-writing campaign—first with chemical manufacturers, then with politicians, neither of which she had much luck with. She had a much better track record, however, when she and 20 of her neighbors started reaching out to the media and various consumer groups about the problem.

"I wrote to about 20 of the big newspapers, then I wrote to consumer groups, magazines, health magazines, insurance magazines, the people I thought would pick it up," Tylczak told the Los Angeles Times in 1989.

The media notice worked; soon, the politicians (including New York Sen. Chuck Schumer, then a member of the House) were knocking on her door pledging support. Noted consumer advocates, like Ralph Nader, were singing Tylczak's praises, and the National Safety Council quickly called on manufacturers to add Bitrex to their products.

As it turned out, at least one company, Procter & Gamble, already had. After doing market research in the early '80s, the company added the chemical to two varieties of laundry detergent after it was found that children were more susceptible to drinking those kinds of detergents over others. But its comments highlighted the fact that resistance lingered.

"We don't advertise the use of Bitrex because we don't want to communicate the notion that our products are not safe if they don't have Bitrex," company spokeswoman Jennifer G. Bailey told The New York Times. "All of our products contain an emetic that would induce vomiting."

Procter & Gamble was nothing compared to the antifreeze industry, which apparently got a PR firm to spy on her operation, according to a Covert Action Quarterly report.

She recalled in comments to The Giraffe Project: “One major anti-freeze manufacturer saw the Poison-Proof Project on CNN and decided to use bittering agents. His comment was, ‘It would cost us more to fight this than to do it.’ Doesn’t anybody just plain want to make a safe product?”

Within a year and a half, Tylczak's grassroots efforts were bearing fruit—she quickly became a fixture on television talk shows, her campaign had gotten notice in Congress, and the industry started changing its ways. And in 1995, Oregon passed a law requiring antifreeze to contain the chemical.

These days, Bitrex is commonly used in all sorts of products you shouldn't drink. So if you accidentally have a toxic chemical in front of you and you feel like taking a swig, you can thank Lynn Tylczak for ensuring that you spit it out almost immediately. (Not that you should even try to drink a toxic chemical. That's stupid.)

"As we age, we don’t have as many taste buds, and we can get used to bitter flavors. As we experience more bitter flavors, we are more likely to crave and appreciate the digestive powers of bitter, which can, for example, balance fatty foods."

— Author and chef Jennifer McLagan, explaining to Publishers Weekly why bitter food is quite often an acquired taste—one that we don't learn to appreciate until we become adults. She wrote a cookbook about bitter foods, because of course she did.

Now, Bitrex is an odorless chemical compound discovered by accident one day by a group of researchers. It didn't have an ideal use at first (beyond making industrial alcohol inedible), but eventually it became an important little chemical for our daily lives.

But the world of chemical production doesn't have a monopoly on terrible-tasting things. Mother Nature has a few of her own, such as durian, an incredibly smelly fruit that's very much an acquired taste—people either love it or hate it. It smells awful, but those who can get past the smell might find a taste they love.

But oh, that smell—one food writer, Richard Sterling, describes it as "turpentine and onions, garnished with a gym sock. It can be smelled from yards away.”

Smithsonian magazine, in a piece on the bizarre fruit, says that scientists have analyzed that smell in an attempt to nail down why it's so pungent, and have found a situation unlike that with Bitrex. It's not a single chemical compound but numerous ones—at least 50 different ones, each evoking a different kind of smell, many of them things you wouldn't want to smell individually, let alone mashed into one super-smell.

If you truly wanted to mess with someone, obviously, you would put Bitrex on a durian. Sounds like an endurance test I'd watch on YouTube.


We Were Selling Computers All Wrong

Today in Tedium: The big-box technology store isn't dead yet—and in the case of Best Buy, it's actually doing pretty well, beating financial estimates by a country mile. But as a whole, the concept of the sprawling electronics chain is far from its peak. In fact, it's sort of another shade of the office-supply store. Despite that, investors keep trying to revive this idea. Right now, an investor named Ronny Shmoel is trying to bring Circuit City back to life as a physical presence, roughly seven years after the brand was mothballed. Here's my question about all this: Does the format really make sense at all? When it comes to selling electronics, can we do better? That's a tough question, one I'll try to unpack in today's Tedium. — Ernie @ Tedium

37k

The size, in square feet, of the average Best Buy store, according to an estimate reported by The Star-Tribune in 2012. That's relatively small compared to the average Walmart SuperCenter, which takes up 179,000 square feet. Both Walmart and Best Buy have been experimenting with smaller formats so as to hit different markets.

We Were Selling Computers All Wrong

(via YouTube)

The tale of DIVX, Circuit City's horrible alternate-universe Netflix competitor

In 1997, Circuit City was perhaps at the peak of its powers as a retail chain, and it was in a position to help shape the way we used technology.

Problem was, the chain's big foray into the world of hardware production was horrifyingly inept, driven by film-industry interests, and put into place in partnership with a law firm. A law firm!

(That law firm, Ziffren, Brittenham, Branca & Fischer, is one of Hollywood's most powerful, by the way.)

But that 1997 announcement of Digital Video Express, or DIVX, was an attempt to reshape the home video industry in a way most beneficial to the movie studios. The strategy, based on the then-new DVD technology, involved encrypted discs that could be watched over a short period of time, then locked again once the availability window ended.

"The DIVX disc is never returned, and so the consumer never has to pay late fees. The disc becomes part of the consumer's home video library, with additional viewing periods easily purchased through the Divx player," a 1997 press release explains. "Consumers also will be able, through the player, to convert many titles to unlimited viewing for a one-time fee, and certain titles will be available for purchase in the store as unlimited viewing discs."

Almost immediately, the idea took on a level of hatred that could only be compared to the way that some people talked about Internet Explorer back then. A number of anti-DIVX sites soon showed up on the scene, with passionate screeds that earned notice in the mainstream press.

John Giberson, a Texas resident, earned media notices by launching the "National Organization to Ban DIVX" soon after the platform's announcement. So assured was he that everyone else would dislike the idea of DIVX that he simply linked to Circuit City's website and told his readers to get informed.

"[DIVX] makes early adopters of DVD mad," Giberson told the South Florida Sun-Sentinel in 1998. "I think it's a bad idea and so do about 500,000 other people that have hit my Web site. I want to see DVD take off and stay as an affordable product for the mass market."

Giberson and other voices soon helped set the conversation, helped along by the fact that Warner Bros. and Sony both opposed the format. Additionally, once you broke down the value proposition, it was clear that buying a DIVX disc wasn't as good as going to Blockbuster, a problem compounded by an unforced error on the studios' part: They didn't include any of the DVD extras on the DIVX discs and, worse, panned-and-scanned everything, meaning that a DIVX offering couldn't compare to a standard DVD.

Another factor was simply logistical: Circuit City was the only firm selling DIVX throughout most of its life, and Circuit City was a big-box store. It generally wasn't as close to your house as the average Blockbuster was, and its hours weren't quite as good, either.

But the biggest problem was that it cost Circuit City a lot of money, and that's what did it in. In mid-1999, the DIVX debacle was shut down, with Circuit City left $233 million in the hole and with little to show for it beyond a lot of angry customers.

Oh, and this training video. They got a training video out of it.

It's just one example of the problems that arise when you treat electronics like appliances.

We Were Selling Computers All Wrong

(Kalebdf/Flickr)

How CompUSA's treatment of Apple fans shows just how poorly suited the big box is for technology

In 1997, Apple was on the ropes and needed a new distribution deal to get its products to the public. So it ran to the arms of a partner it already knew.

But ultimately, the lessons it learned from the experience led the computer maker to forge its own path.

That year, the tech superstore CompUSA agreed to create a store-within-a-store concept for Apple, allowing the company to put a sales rep on the floor who could properly help Apple's own customers make decisions.

We Were Selling Computers All Wrong

(via YouTube)

"We believe in the future of Apple, and we are making a large investment to show our recommitment to Apple," CompUSA President and CEO Jim Halpin said in a press release at the time. "These specialized departments within our Superstores will offer a superior buying experience for the Apple customer."

Problem was, Halpin's commitment apparently wasn't communicated down to individual employees. The Apple enthusiast site MacsOnly ran a page charting various consumer experiences at CompUSA stores around the country, and there were a lot of situations where the staff didn't really understand the platform. Understandable, to a degree, because it was a period when the Mac had slim uptake. But not for a company that was supposed to be actively promoting Apple.

Even famed technology columnist David Pogue, then a writer for Macworld, had trouble. He went to half a dozen locations around the country, dealing with CompUSA employees at each location who seemed to have little to no interest in selling Mac-based hardware or software.

"Whatever the excuse, CompUSA simply isn't holding up its end of the sweet Apple deal," Pogue noted. "In steering potential buyers away from Macs, store clerks make a mockery of CompUSA president Jim Halpin's 1997 promise to make his stores 'the Apple headquarters for America.'"

So no, Apple couldn't even get a decent presentation from a big-box store, and that's why it eventually started looking into its own retail outlets. Those outlets, smaller and more specialized, did the job much better than a big box ever could.

The ceilings are nowhere near as high, either.

"Stores will still be necessary to sell products and services and to teach people how to operate in this new digital civilization."

— Carlos Slim, the Mexican billionaire, discussing his 2000 purchase of CompUSA for $800 million. Despite that seemingly sound bet, CompUSA faltered by 2007, with Fortune characterizing Slim's ownership of the firm as "a rare misstep" for the world's richest man at the time. Turns out, stores apparently weren't necessary.

When CompUSA closed its doors in early 2008—earning a brief mention in my piece on liquidation from about a year ago—the company gained a bizarre reputation on consumer blogs for attempting to sell products that did not work.

The Consumerist, then owned by Gawker Media (which did not have a liquidation sale), kept an eye on the bloody toll. DVD players labeled "defective" were being sold for $179.98. Shattered LCD screens, despite being no good, were sold anyway. People attempting to buy stuff with cash were turned away by store staff.

In some ways, I wonder if the model for the computer superstore deserved to be left on the shelves in liquidation.

These stores happened to grow into the behemoths they did not just because the computer industry grew, but because, in many ways, they were competing with every other type of retailer. Bed Bath & Beyond has huge stores, too; so do Michaels, IKEA, and even Old Navy. Computing as a medium is simply too mainstream, and a tiny Radio Shack store wasn't going to cut it.

One problem, though: For decades, people who sold computers never really figured out what the retail model wanted to be. Does it want to be built around heavy customization? Does it want to be an intensely personal experience? Should you get the chance to touch the device you're about to buy, or are you better off just grabbing it in the mail after the fact? Should salespeople be up in your face or should they get completely out of the way?

It took probably 20 years to get a reasonable answer to most of those questions.

But big-box electronics retailers had to start somewhere, and rather than treating gadgets like some of our most important possessions, retailers long sold them like just another appliance. Computers are not washing machines—we work with them far too intimately, and we shouldn't sell them the same way. The fact that computers are built through the same sorts of industrial processes that make washing machines must have confused someone down the line.

With their boutique stores, Apple, Microsoft, and other big tech firms have figured that out. And even if the context is all wrong, it doesn't mean the success of an electronics superstore isn't possible: Best Buy is doing OK right now, but for a while, it was looking pretty shaky.

Sure, Amazon could still eat their lunch—they've done it before. But I wonder what a totally rethought Best Buy would look like, unshackled from its roots as a stereo retailer.

Like the devices they sell, electronics retailers can occasionally use a reboot.

The Halloween-Industrial Complex

Today in Tedium: When a retail chain or store inevitably shuts down, it leaves an open sore in the real estate sector—with no more Blockbuster nights to be had nor any fresh tenants to be found (perhaps mattress stores?), there has to be someone to fill that hole cheaply, quickly, and without much fuss. Enter the Halloween industry, which offers a nice balance of seasonal appeal and consistent public interest. Today's Tedium is about the pop-up Halloween shop, the zombie of modern retail. It's scary how well this model works. — Ernie @ Tedium

$8.4B

The amount that Halloween is expected to scare up in retail revenue this year, according to the National Retail Federation. Spending on Halloween costumes alone is a $3.1 billion economic driver, according to the retail group. While that spending is significant—and a big jump from prior years—it's actually somewhat modest compared to some other holidays. Valentine's Day, a similar holiday where people spend money, was anticipated to drive $19.7 billion in spending, according to the NRF, and Father's Day $14.3 billion. (Maybe it's because candy costs less than golf clubs? Just spitballing here.)

The Halloween-Industrial Complex

(Kimco Realty/Flickr)

How a dress-seller noticed a trend and created an entire industry around Halloween

"I didn't invent temporary sales. But I feel like I invented temporary Halloween."

In the early 1980s, Joseph Marver had a problem. The San Francisco-based dress retailer couldn't for the life of him convince anyone to buy his dresses. But he did see that many of his potential customers were passing up his store to go to a nearby costume shop.

That shop eventually gave Marver an idea that would change the way we celebrate a major holiday for good. According to a Newhouse News Service piece from 2000, Marver put his dresses away for a month, and started selling costumes of his own. Sales went through the roof, of course.

The next year, he ditched his dress store and put up a display in a mall, and had even more sales success. Soon enough, he was running with the Halloween store concept all along the West Coast.

Meanwhile, on the other side of the country, another store, New Jersey's Party City, was quickly figuring out the same thing: that Halloween was a huge draw for customers, and that they should program their entire retail cycle around Halloween, not Christmas.

Party City eventually grew into the world's largest chain of party-supply sellers (with an unusual lean on Halloween, which made up a quarter of the company's sales in 2014), but it was Marver's concept, known as Spirit Halloween, that proved the most unusual and influential.

Marver's chain, which carries on average more than three times the number of Halloween items a traditional store might, grew quickly in the West and Midwest, eventually drawing the notice of another prominent novelty retailer, Spencer Gifts, which grabbed it in 1999.

From there, the chain—and the basic concept—exploded.

The Halloween-Industrial Complex

(Elmer Boutin/Flickr)

Five reasons why Halloween stores work so well in pop-up form

  1. Buildings stay empty for long periods of time: It's harder to find a new tenant for a retail building than it is a home, and as a result, shopping centers often have empty spaces for a number of months or even years before a replacement appears. CityLab notes that Spirit Halloween and other chains take advantage of these market weaknesses by maintaining strong relationships with commercial real estate owners, who would much rather accept a temporary lease than no lease at all.
  2. Halloween stores are shape-shifters: If you have a big-box space available, Spirit Halloween can make it work, but if all you've got is a box the size of a Radio Shack, it's more than enough to sell Halloween goods. On its website, Spirit says that it can make spaces as large as 50,000 square feet and as small as 3,000 square feet work for its needs. The real issue, says the chain, is that there needs to be a significant amount of car traffic in the area, along with "awesome visibility."
  3. Inventory can be reused: Unlike technology or fashion, most Halloween gear doesn't really go out of style, which means that it can be returned and reused repeatedly over time, keeping production costs low. Franchisees benefit from this setup, because they can pay a deposit on the merchandise before launching the store, then sell back the goods to the Halloween store chain and receive merchandise credit from Spirit. “This way retailers don’t have to fund all of the inventory in advance and they don’t have to carry inventory over to next year either,” Spirit's Sullivan told Specialty Retail.
  4. The pop-up nature allows for planning time: Spirit Halloween founder Joseph Marver noted that, because he had so much lead time between the beginning of the year and Halloween, he was able to spend much of the year plotting for the year's big trends. If Spider-Man was going to be big that year, he was able to get a Spider-Man costume in production. But you always have to plan for the unexpected hit. "You'd better have some money left over for sleepers—movies you didn't know would be a box-office smash and kids were going to want," Marver added in his comments to Newhouse News Service.
  5. There's another big seasonal holiday immediately after: Just two months after Halloween is Christmas, another kind of holiday that also works well in a retail context. Halloween Adventure, a smaller Halloween store chain, has been known to convert the locations to Smart Toys stores, which are intended to jump on the Christmas trend.

The Halloween-Industrial Complex

(cjbird88/Flickr)

How the recession created the perfect opportunity for the Halloween store

When regular retail is doing poorly, the Halloween store is in a position to do really well.

Case in point: The increased presence of the Halloween store in the years after the 2008 collapse of the economy, when regular retailers were abandoning their larger locations. It's bad enough when a boutique store leaves the mall, but when the Sears leaves, it becomes a huge problem.

In the years after the recession, two trends emerged: One, Halloween stores became a lot more common around the country; and two, the stores started taking up larger and larger spaces. During its 30th anniversary in 2013, Spirit Halloween had 1,050 locations—a huge surge from the five dozen or so it was running when Spencer bought it in the '90s. (For comparison's sake, J.C. Penney currently has 1,063 locations and Toys "R" Us has 1,132 locations.)

The reason these Halloween chains grew so quickly around the recession is best explained by the death of chains like Circuit City and CompUSA. Both died at roughly the same time—just before or during the recession. Both companies left behind massive real estate spaces, taking up tens of thousands of square feet each. And, because it was the recession, odds were low that new tenants were going to take their place anytime soon.

Vacant big-box retail spaces are simply harder to rent out than smaller ones.

The Halloween-Industrial Complex

(Daniel Oines/Flickr)

As a result, Spirit Halloween took over 83 former Circuit City locations in 2009, the year that the famous electronics chain closed.

Temporary Halloween stores were once seen as bad business, somewhat of a blight on the retail landscape, but the recession basically forced commercial real estate owners to stop being so picky.

And there's even a chance that the success of a Halloween store might set the stage for a full-time tenant. Party City is a prominent part of the temporary-store trend through its Halloween City subsidiary. According to National Real Estate Investor, the publicly traded company uses the Halloween stores as a testing ground: If a temporary Halloween City store does particularly well at a location, Party City may decide to open up a permanent location at that spot.

By re-animating mummified retail spaces, Halloween City, Halloween Express, Spirit Halloween, and others are doing the economy a bit of a service. They're offering the public something it's known to like, while giving real estate companies an opportunity to get a modest return on their investment.

The Halloween pop-up store phenomenon is very much still with us today—this year, Spirit Halloween is up to 1,150 locations. And the trend may not fade anytime soon.

However, the model is likely to become harder and harder to pull off as the years go on, not because people aren't interested in Halloween, but because retail is doing a lot better these days.

CityLab notes that, starting around 2014 or so, shopping centers and strip malls began to fill up again, in part because new malls aren't being constructed fast enough to keep up with tenant demand. In fact, the International Council of Shopping Centers reported this very point, based on research from the real estate firm JLL, back in December.

That means that it's going to cost more for a Halloween chain to rent out a retail crypt in the years to come—because there are fewer crypts than usual these days.

But even if ghosts are going to be harder to find in the future, it doesn't make it any less spooky once you do find one.

The Internet's First Election

Today in Tedium: For the past 20 years, a familiar trend has revealed itself with the turn of every U.S. presidential election cycle: Each one is ever-so-slightly more defined (heck, even redefined) by technology, particularly online trends that have come along in the three years prior. In 1996, it was a coming-out party for the web. In 2000, the growth of online news began to have an impact on how we researched candidates. In 2004, it was all about Meetup, as well as suddenly prominent individual bloggers like Charles Johnson of Little Green Footballs, whose dogged work in debunking a questionable report changed the shape of Dan Rather's career. In 2008, fairly new outlets like YouTube and Politico became immensely influential. Twitter was around back then, but it wasn't until 2012 that it came into its own, along with the big-data campaign. This year, in many ways, has brought the messy culmination of all these trends, along with a few others (Facebook Live, anyone?). But I'm curious about how online networks affected elections before the internet had such a big impact on them, before cable networks teamed with social networks to promote primary debates. Today's Tedium goes back to the 1992 campaign, in a year when backwards pants, Temple of the Dog, and Ross Perot were still in style. — Ernie @ Tedium

The Internet's First Election

(via the National Communication Association)

Five interesting ways technology influenced the 1992 presidential election

  1. Prodigy, the early online service that directly competed with AOL for a time, launched a 1992 campaign database for users to track candidates. The effort came about thanks to a collaboration with the League of Women Voters. Even better, Prodigy allowed you to write your representative electronically—well, kinda. "If you want to get your view across and write to your representative, you can write a letter on the computer screen," The Washington Post noted in February of 1992. "Prodigy will print and mail it for a fee of $2.50."
  2. For his 1992 primary campaign, current California Gov. Jerry Brown innovated by using a 1-800 number to solicit donations. Sound kind of quaint? Don't be fooled: This was a Big Deal in 1992, as it hadn't properly been utilized by candidates previously. As the San Francisco Chronicle learned back in 2013 (and I just confirmed), the widely disseminated number, (800) 426-1112, is still active and still owned by Brown, though it's no longer accepting donations.
  3. Jerry Brown also used CompuServe to reach voters, but so, too, did the Lincoln Chafee of the 1992 campaign, former Irvine, California, mayor Larry Agran. Agran, who didn't last beyond New Hampshire, held online Q&A sessions on the early online network, with Bloomberg noting that Agran would speak his answers to the online questions out loud while a transcriptionist typed them into the computer.
  4. Usenet! Freaking Usenet! The 1992 campaign was a hot topic on Usenet, the decentralized newsgroup system which is best described to those who never experienced it in person as the Reddit of its day. There were groups for all of the major candidates, as well as ample evidence that people didn't like Bill Clinton even back then, and Ross Perot was a hot topic way back when. (Side note: Many newsgroups from the era are still active, including one for Rush Limbaugh.)
  5. MIT-programmed mailing lists: In 1992, the Massachusetts Institute of Technology ran a number of email-driven bots on the campaign92.org domain, allowing users to request position papers for any campaign on the ballot in at least half of U.S. states—which meant Libertarian Party candidate Andre Marrou and Natural Law Party candidate John Hagelin got mailing lists, too. (Not that Hagelin got any respect—the MIT press release at the time called him "Larry.") MIT estimated that in the days before the 1992 election, it was sending out 2,000 emails a day through the accounts. (A rough sketch of how such a request-and-reply bot works follows this list.)
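
The mechanics of those email bots were simple: read a request, match it to a document, mail the document back. Here's a minimal sketch of that routing logic; the topics, file names, and addresses are hypothetical examples, not MIT's actual setup:

```typescript
// A toy version of a 1992-style position-paper mailbot's routing logic.
// Topics, file names, and addresses are hypothetical examples.
const papers: Record<string, string> = {
  economy: "economy-position.txt",
  healthcare: "healthcare-position.txt",
  environment: "environment-position.txt",
};

// Decide what to send back for one incoming request.
function handleRequest(fromAddress: string, subject: string): string {
  const topic = subject.trim().toLowerCase();
  const paper = papers[topic];
  return paper
    ? `send ${paper} to ${fromAddress}`
    : `send topic index to ${fromAddress}`; // unknown topic: reply with a menu
}

console.log(handleRequest("voter@example.com", "Economy"));
// → send economy-position.txt to voter@example.com
```

Trivial by today's standards, but in 1992, an always-on autoresponder pushing 2,000 messages a day was genuinely novel campaign infrastructure.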

"One little-noticed development that illustrates the interactive nature of modern technology is the use of electronic mail. During the general election campaign, the text of all Bill Clinton's speeches as well as his daily schedule, press releases, and position papers were made available through on-line computer services, such as Compuserve and Prodigy."

— Dee Dee Myers, Bill Clinton's first White House press secretary, discussing in The American Behavioral Scientist, an academic journal, how the Clinton campaign pioneered the use of online communications during the 1992 campaign. Myers characterized the endeavor as democratizing what would have previously been private pool reports. "For the first time, ordinary citizens had an easy way to obtain information that was previously available only to the national press corps," she noted. "Instead of seeing an 8-second sound bite chosen by a network producer, voters could read an entire speech." Clinton later became the first president to launch an official email address—president@whitehouse.gov, of course—and website. (In case you're wondering, George H.W. Bush's use of the internet during the 1992 campaign was much more, uh, conservative—limited, according to the Routledge Handbook of Internet Politics, to emailing policy statements and speech transcripts to bulletin boards.)

The Internet's First Election

Ross Perot (Wikimedia Commons)

How Ross Perot's 1992 campaign helped pave the way for the Internet Archive

Last year, the Internet Archive's Jason Scott uncovered a piece of archival material so obscure that even the organization's founder, Brewster Kahle, didn't remember creating it. The two-hour-long video, previously unmarked and with fewer than ten views at the time Scott discovered it, features an interview with Kahle that dates back to his days as head of WAIS, Inc., an early attempt to create a broad online database.

WAIS, which stands for "wide area information server," was effectively an internet-enabled database technology that allowed for long-distance access to digitally organized content. Initially born in 1989 as a research project, it became something of a footnote in the history of the internet as a combination of the World Wide Web and search engines like Google took its place. But the thinking in its design was, in some ways, fundamental to the way we use the internet now.

The technology, at the time, was seen as particularly valuable—especially in the days when it wasn't clear how we would eventually access data online. Spun off from the supercomputer firm Thinking Machines in 1992, WAIS, Inc. represented one of the first attempts to make the internet user-friendly.

And, as a company, it got its start thanks to a presidential campaign—Ross Perot's, to be specific. As Kahle states nearly an hour into the video, the campaign was basically the reason the company was created when it was. In fact, Kahle described it as a "fortuitous event": He knew Perot Systems head Mort Meyerson, who had a sudden need for an information-organization platform for the campaign of the company's namesake.

The Internet's First Election

Brewster Kahle, circa 1992. (via the Internet Archive)

"Basically they've got this organization of people that are in 50 states, that is ad-hoc, that has three months to live, that has to keep in touch with each other," Kahle explained in the clip. "So they have lots of information coming from the field, and they have, they need to collect up in a central place, figure out what it is they're trying to do, what the statements, positions of Ross Perot are, and to disseminate that information out again. Paper, or having people fly back and forth, just wasn't fast enough. Electronics made a whole lot of sense within a campaign structure."

The high-tech approach considered by the Perot campaign makes a lot of sense. As is well-known, Perot had a significant background in technology, as the founder of Electronic Data Systems, an information technology management firm that dates to the 1960s. (He also famously invested in Steve Jobs' NeXT.)

Kahle's system, put together in a couple of days, impressed political strategist and campaign manager Ed Rollins, who quickly put it to work for the campaign. Soon after, Rollins quit the campaign, and Perot famously suspended his effort before eventually hopping back in. This created a problem for Perot Systems, which now had an impressive tool it could no longer use for its intended purpose—organizing news clippings and press contacts.

(If you know where to look on the Internet Archive, you can actually find the proposed statement of work describing this effort to create "an Intelligence System for the Perot Campaign.")

Eventually, with WAIS Inc.'s help, Perot Systems decided to eat its own dog food.

"Basically Perot Systems' point was, 'What could Perot Systems use this for?' in terms of selling it to their customers," Kahle explained. "So we installed it within Perot Systems to help them run their own corporation. And that, now it's November of '92, we've basically installed it and got it running for six months. The pilot groups are using the system for all sorts of information."

Soon enough, WAIS, Inc., off the back of client work like the Perot campaign and other projects, became a successful, profitable business, one that was only slightly ahead of its time—and close enough in concept to the web that it basically predicted the internet's future.

As the internet started to take a more formalized shape, Kahle and other stakeholders sold WAIS, Inc. to AOL, and with the money earned from the buyout, he and fellow WAIS co-founder Bruce Gilliat started two projects that are still active to this day: Alexa, an online search and analytics tool that crawls the internet's many websites, and the Internet Archive, which keeps the internet's memory safe for generations to come. (They eventually sold Alexa to Amazon for $250 million in stock. It's unrelated, except perhaps in spirit, to Amazon's Echo-driving AI project of the same name.)

As the 1996 election was just ending—with the internet's role in future elections secured, thanks to sites like Clinton/Gore '96 and Dole/Kemp '96—Kahle was getting a start on the project that would become the internet's ultimate scrapbook.

On October 26, 1996, the Internet Archive began in earnest, helping to collect big statements and tiny wrinkles alike on the internet. Yes, that means its 20th anniversary is coming up next week. (Did you know they accept donations?)

That broad swoop of data collection extends to the world of politics. These days, the Archive has in some ways democratized the basic ideas that Kahle's campaign machinery was built upon back in '92. The nonprofit's Political TV Ad Archive collects every political ad that has run during the 2016 campaign season (including links to fact-checks of the ads), along with granular data about each of the presidential debates—the latter with the help of the Annenberg Public Policy Center. What WAIS was doing for Perot's campaign way back when, the Political TV Ad Archive is expanding upon—and doing for the public at large.

Around this time of year in the '92 election, Ross Perot was making some unusual left-field moves in an effort to topple Clinton and Bush. Memorably, he ran a series of prime-time infomercials ahead of the election, which he paid for out of his own pocket, because he's Ross Perot. Perot didn't win in '92, in part because his decision to drop out of the race, then return, didn't sit well with some.

But just imagine how effective those ads could've been if they had the full power of Brewster Kahle's database behind them during that three-month break Perot took.

Own To Rent


Editor’s note: Today’s issue is created in partnership with the folks at Make Room, a nonpartisan organization sponsored by the housing nonprofit Enterprise Community Partners.

Today in Tedium: There’s a lot of country hiding between New York and San Francisco, but for some reason, these two cities dominate discussions about rent prices in the United States. It’s understandable. It’s pretty easy for your eyes to pop when you read about a 1-bedroom apartment’s price topping $3,500, as ApartmentList reported about San Francisco earlier this month. But what about the rest of the country? Are things out of whack, and how do we get them back into whack? That’s a complicated question, but today’s Tedium tries its best to answer it. — Ernie @ Tedium
Own To Rent

Make Room gives voice to struggling renters and elevates rental housing on the agendas of our nation’s leaders. We’re advocating for better policies and telling the stories of real families who can’t make rent today.

Today's issue is sponsored by Make Room. (You can sponsor us, too.)

1.3M

The anticipated number of newly housing-insecure renters over the next decade, according to projections by Enterprise Community Partners and the Joint Center for Housing Studies. And that’s only if rent growth merely matches income growth. That traditionally hasn’t been the case.

Own To Rent

(Unsplash/Pixabay)

If you compare historic rents to the rate of inflation, something startling emerges

Shelter is an essential thing—up there with food and water. But while those other essentials remain relatively inexpensive, the cost of proper housing has surged over the years.

But to really get a grasp of how dramatic this surge has been, it helps to look at it in historical context: How far would a 1940s-level rent get you nowadays?

In 2013, the real estate blog Curbed pondered this question by analyzing rents in historic Village Voice classified ads. In the 1940s, a New York City apartment in a decent neighborhood could be had for about $50. What would that $50 get you now?

According to the analysis, that $50 could get you a place on the Lower East Side back then. These days, adjusted for inflation, it works out to between $780 and $830. The average rent in NYC is $3,800—and that $800 or so will barely even get you a shared apartment.

"To find a private apartment for the same $800 price range, one must go to the far reaches of the city," the website’s Jessica Dailey wrote.

But even outside of New York, rental rates have traditionally outpaced the rate of inflation. Between 2010 and 2014, the U.S. Census Bureau reported the median gross rent nationwide to be $920. In comparison, the median monthly rent in 1940 was $27—an amount that would have been the equivalent of $456 in 2014.
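(If you want to check that math, it's just a ratio of price indexes. Here's a minimal sketch in Python; the CPI figures are an assumption on my part, approximate annual averages for the consumer price index, so the exact multiplier depends on which index and source you use.)

    # Inflation adjustment as a ratio of consumer price indexes.
    # The CPI values are approximate annual averages (an assumption);
    # precise figures vary by source.
    CPI = {1940: 14.0, 2014: 236.7}

    def adjust(amount, from_year, to_year):
        """Convert a dollar amount between years via the CPI ratio."""
        return amount * CPI[to_year] / CPI[from_year]

    print(round(adjust(27, 1940, 2014)))  # ~456, the figure cited above
    print(round(adjust(11, 1940, 2014)))  # ~186, the Mississippi figure below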

Sure, plenty of things have changed since then—our definition of basic utilities has expanded to include cable television and internet, we expect air conditioning, and there are tougher regulatory and maintenance standards than there were 76 years ago. But when it comes down to it, the price of rent nationally has more than doubled.

Even in states with rents below the national average, the increase has been fairly dramatic. In 1940, for example, Mississippi was the least expensive place to rent in the country, with places going for an average of $11 per month, or $186.01 in 2014 numbers. In 2014, the median rent in Mississippi tallied $714, nearly four times what the inflation-adjusted 1940 rent could buy. (And no, Mississippi is no longer the cheapest state to rent in; that honor goes to West Virginia, or if you want to expand to official U.S. territories, Puerto Rico.)

Certainly, cheap rents can still be found in some places—if you’re willing to live in Toledo, Ohio, you can get an apartment for $638 per month, making it the lowest rent in the country in a city with a population topping 250,000, according to American Community Survey estimates gathered by FindTheHome. Problem is, the per-capita income in that area is $19,113—meaning that the rent would account for 40 percent of the average Toledo resident’s income if they were living by themselves.

Plenty of people in Toledo probably make a lot more than $19,113. But plenty more probably make less. And looking at things in recent terms highlights just how problematic this trend really is.

21.3M

The number of households that spent more than 30 percent of their income on rent in 2014, according to research by Enterprise Community Partners and the Joint Center for Housing Studies—a level that is considered "moderately rent burdened." Of those, 11.4 million households spent more than half of their income on rent—a level considered “housing insecure.”
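(Those thresholds are easy to compute for any household. Here's a quick Python sketch using the definitions above, applied to the Toledo figures from earlier; the function name is mine, not an official HUD term.)

    # Classify a household by rent burden, per the thresholds above:
    # more than 30 percent of income is "moderately rent burdened,"
    # more than 50 percent is "housing insecure."
    def rent_burden(annual_rent, annual_income):
        share = annual_rent / annual_income
        if share > 0.5:
            return share, "housing insecure"
        if share > 0.3:
            return share, "moderately rent burdened"
        return share, "not rent burdened"

    # The Toledo example: $638 a month against $19,113 per-capita income.
    print(rent_burden(638 * 12, 19113))  # roughly 0.40: moderately rent burdened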

Own To Rent

(turkeychik/Flickr)

Why things are so out-of-whack for renters

"Affordable housing" is often bandied about as a buzzword in the world of real estate, despite the fact that it’s considered essential for many socioeconomic groups by the U.S. Department of Housing and Urban Development.

The problem is, it’s often not reflected on the ground. In some ways, it feels like a myth, like you’re given a set of options—safe, inexpensive, convenient, sizable—and told to choose just two. You’re more likely to see an ad for a fancy apartment you can’t afford on your just-scraping-by salary than one that’s suited for your needs.

Now, here’s the crazy thing about this: The federal government has programs in place that are supposed to provide rental assistance to the people who need it—in the form of Section 8 rental assistance vouchers and Low-Income Housing Tax Credits. These programs keep people in their homes, and bring a bit of stability to what can often feel like a losing situation. When incomes aren’t rising as fast as rents, this kind of help can be a lifesaver.

The problem is, just 23 percent of Americans who qualify for such programs actually receive this help, according to the National Bureau of Economic Research. Part of the issue is funding; part of it is legislative will.

Slightly higher up the economic food chain, the U.S. government actually assists people with housing on a fairly consistent basis. These programs have definite value—they put homeownership within reach of people who otherwise couldn’t afford it, but the problem is that the calibration is somewhat out of whack: Households that make more than $100,000 per year are much more likely to receive home mortgage interest and property tax deductions than those that make less than that. In fact, three-quarters of these subsidies go to households that make more than $100,000, and more than a third to households with incomes above $200,000.

This year, the federal government will put $140 billion into programs designed to help homeowners. It’ll put just $55 billion into programs designed to help renters. In an era when renting is becoming more common, the gap raises way too many questions.

Anyone who’s lived in a city knows that the vibe and overall culture of that city is the sum of its parts. You lose that vibrancy when the people who want to live there can’t. Putting most of the weight of housing policy on ownership rather than rental limits a community’s long-term potential.

And that’s a bummer—the person pouring your drink deserves an easy commute just as much as the person who owns the bar.

Houses are great investments. If you can knock out a 30-year mortgage, you’ll eventually have something that’s in your name that holds immense value. But sometimes, purchasing a home isn’t a good option. Say, for example, your chosen career path as the next Lin-Manuel Miranda means you’ll have to spend some time as a starving artist before you can sell out Broadway night after night and make a star turn on Saturday Night Live. Or maybe you have a career path that requires you to move every couple of years.

Or maybe you’re on a lower socioeconomic tier than other people in your community, and the idea of buying a house feels like a pipe dream. If the rent is too freaking high, how are you ever going to get ahead?

Let’s put this another way: You’re familiar with the concept of subprime mortgages, right? Well, guess where the people who found themselves underwater on bad mortgages went to live after their homes were foreclosed upon? I’ll give you a hint: They most likely didn’t purchase another home after that. They probably went back to renting.

In fact, the numbers back this up: The U.S. homeownership rate has fallen sharply since 2004, with the U.S. Census Bureau reporting back in July that the homeownership rate during the second quarter of 2016 was 62.9 percent—its lowest level in 51 years. Meanwhile, renting has become much more common: CityLab's Richard Florida reported back in February that the trend has sharply favored the renter, especially among millennials, with the rental rate among 18-to-34-year-olds sitting at 71.6 percent in 2014—up roughly 9 percentage points from the mid-2000s.

The federal government has long helped homeowners pay for their homes through subsidies and other forms of assistance. Now, if it’s a renter’s market, why isn’t the federal government—or heck, why aren’t state and local governments, which have a more direct effect on local market rates—doing more to help renters?

What can we do to change this? We can start by sending a message to federal, state, and local representatives. These are the officials we’ve elected into power—we need to hold them accountable to address this crisis and start to put policies into place that work.

Let's face it: The odds are good that if you're feeling the pain of high rents, so are most of your friends. Why shouldn't you speak up?

Such Great Heights


Editor’s note: Today’s issue is a piece from writer Andrew Egan, who had a great story he wanted to share about something really risky. It took him to a few interesting places. Here’s his story.

Today in Tedium: Risk is a part of everyday life. Even basic tasks, such as commuting to work or eating, involve a minimal amount of danger. Yet some people prefer activities that involve an inordinate amount of risk. Is that the whole point? In this issue of Tedium, we head to Montana (and Idaho, and Utah, and Arizona, and Nevada), on the trail of a relatively new extreme sport, to examine if risk is really all that risky. — Andrew @ Tedium

"These are not holidays. These are adventures and so by their very nature extremely risky. You really are putting both your health and life at risk. That's the whole point."

— A warning quote from the Adventurists, organizers of the Icarus Trophy, a cross-country paramotor race. The event is part of a host of adventure travel offerings that also includes a horse derby across Mongolia and a car rally through the Gobi Desert. Icarus pilots had to pay a $2,000 fee and cover their own expenses during the 14-day race.

Such Great Heights

Trey German being interviewed at the finish line in Mesquite, Nevada. (all photos by Andrew Egan)

The kind of person who flies with only a lawnmower engine and a fancy bedsheet

On a beautiful Sunday morning in early October, a small collection of pilots gathered in Polson, Montana, for a race. The term pilot might be a little misleading for laymen, as the aircraft they fly aren't regulated by the Federal Aviation Administration (FAA). A powered paraglider, also called a paramotor, is one of the cheapest and most basic ways to fly. To most people seeing one for the first time, it also looks like a batshit-crazy thing to do.

(The Toll Road/YouTube clip)

Put simply, a paramotor is an engine with props (spinning blades) and a fabric wing. So, basically, a lawnmower engine and a fancy bedsheet. To take off, pilots turn on their motor, catch wind in their "wing", and start running. If the conditions are right, they quickly gain speed and, hopefully, take flight. Of course, that doesn’t always happen. The altitude of many of the landing points during Icarus exceeded 4,000 feet. The air was thin and face plants were common.

"The air changed on me," Dean Kelly explained after a failed takeoff attempt outside of St. Ignatius, Montana. An Australian that's piloted paramotors for over two years, Kelly cracked two props and damaged the outer cage surrounding his motor. Fortunately, he was not hurt.

For the uninitiated, the risk associated with paramotoring often seems too great. James Borges, a pilot from the United Kingdom, explained the appeal by making an ‘X’ with his arms, symbolizing a risk/reward graph.

"It’s not really about risk, we try to manage risk as much as possible. If we do that, we get to fly in a way few people experience."

One of the few studies on extreme sports athletes found that emotional management, especially of fear, is common. Participants are eventually rewarded for successfully managing risk and fear with a healthy dose of adrenaline and dopamine, the chemicals in the brain responsible for sensations of happiness and satisfaction.

"Dopamine plays an important role in the reward- and motivational systems in the brain, and high levels of it leads to feelings of well-being," an overview of the study reads. “...The experience of fear induced by risk may be compared to the response people have after surviving dramatic incidents such as serious illness, car accidents or traumas. People often report that these experiences change their lives. Such experiences may in the longer run lead to personal development and increased appreciation of life.”

Talking to a paramotorist after a flight involves a fair amount of high fiving and handshaking. Phrases like "pumped," “Yeah boy!” and “holy fucking shit man” are routinely bandied about. It takes at least fifteen minutes for the excitement to reduce to a simmer. Attempts to conduct proper interviews after flights were more or less pointless. Pilots, for whatever reason, tend to downplay the role adrenaline plays in their decision to paramotor.

"When you first start, oh yeah, there's always an adrenaline rush," said Trey German, an engineer from Texas that's been flying paramotors for nearly three years. “After a while, you really only feel it during maneuvers or extreme conditions.”

Which might explain why these pilots would choose to participate in the Icarus Trophy, claimed by its organizers to be the world’s toughest air race. The 2016 version of the race started in Polson and ran 1,100 miles through five states before ending near Las Vegas. This course was 300 miles longer than the previous year’s, which ran from Seattle to Sacramento.

"During one flight, I experienced hail, snow, rain, and turbulence," Kelly said about the experience. “I had never dealt with those conditions before and I got to deal with them all at once.”

7

The total number of pilots that flew during Icarus 2016. All but one finished. The 2015 race featured 33 pilots, many of whom were injured; the injuries included a severely sprained ankle and a broken wrist.

Such Great Heights

A paramotor pilot flying at sunset over the Bonneville Salt Flats in Utah.

Snow and Sleet and the Smell of Feet: A view of Icarus from someone who followed along

In its second year, the Icarus Trophy tormented competitors and their support staffs with cramped living conditions, unpredictable precipitation, and broken equipment. Some of the most experienced paramotor pilots in the world participate in Icarus, but even they're not immune to accidents. On the first day, Scotty Duncan, a well-known Australian pilot, blew an engine and had to repair it with the help of a local machinist, who was also kind enough to serve him elk stew.

"It’s part of the experience and adventure," Kelly said after his failed attempt to launch on the first day. “Where’s the fun if everything goes right?”

Injuries during Icarus are all too common. The 2016 race was largely free of serious injuries, though most pilots dealt with tumbles and muscle strains. German even landed among thick desert shrubbery and had to remove his pants to pull out dozens of thorns. Still, these incidents pale in comparison to 2015, when injuries included a broken wrist, a severely sprained ankle, and a broken back. By one competitor's estimate, some fifteen percent of that year's pilots suffered some form of injury. Clearly, paramotoring is a dangerous sport, but exactly how dangerous is difficult to discern.

Some back-of-the-envelope math done by German found that paramotoring is about as safe as riding motorcycles. One of the issues in determining the exact risk associated with paramotoring is the relatively small size of the community, not just in the United States but around the world.

"There are maybe twenty or thirty thousand paramotorists in the world but ones that fly regularly is much less, maybe ten thousand." German quickly added, “And there are only seven pilots with the balls to take on Icarus.”

A few places on the Icarus trail

Such Great Heights

Polson, Montana: Start line of the Icarus Trophy, elevation 3,000 feet above sea level.

Such Great Heights

Idaho Falls, Idaho: The Falls of Idaho Falls. One of the larger cities along the Icarus route, Idaho Falls is also home to Idaho National Labs, the country’s only nuclear reactor test site.

Such Great Heights

Bonneville Salt Flats, Utah: Maintained by the Bureau of Land Management as a race track, the Bonneville Salt Flats weren’t technically on the Icarus route. A few of the competitors decided to make the drive for a day of spectacular flying. We arrived a day or two after rains left an inch of water across the salt plain. Several land speed records have been set at the Flats.

Such Great Heights

Moab, Utah: A beautiful resort town near Arches National Park, Moab is surrounded by beautiful red rock mountains and unique geological formations, like the Delicate Arch.

Such Great Heights

Monument Valley, Utah: In southern Utah, near the Arizona border, much of the area belongs to the Navajo tribe. This iconic landscape has been featured in dozens of films and TV shows, including Forrest Gump and Back to the Future Part III.

Such Great Heights

Downey, Idaho: A small town in Southern Idaho with a grass strip airfield, Downey forced competitors to race against weather and make decisions that could have fatal consequences. Miroslav Svec is pictured here attempting to beat an incoming storm to the mountain.

Motorcycles are probably the best analogy for understanding paramotors. As a concept, neither is particularly dangerous unless speed, terrain, or trick maneuvering is involved. However, even experienced pilots of fixed-wing aircraft (read: regular airplanes) often think the paramotorists landing at their airports are a bit crazy. Throughout the race, those pilots routinely cautioned the competitors against taking off in bad weather. They went anyway.

"Challenging yourself to do new things is a big part of Icarus," German said. “If this were easy and without risk, it wouldn’t be as fun or memorable.”

After traveling with the Icarus competitors through five states and more than a thousand miles, I’m only now starting to understand. That’s the whole point.

Andrew Egan is a writer living in Texas. He’s previously written for Forbes Magazine and ABC News. He just completed his first novel, Nothing Too Original, and his collection Drink Your Whisky Like a Man spent one week on Amazon’s American poetry best seller list. You can find his terrible website at CrimesInProgress.com.

The Original In-App Purchase


Today in Tedium: Before software, purchases generally worked like this: You walked into a store, you bought a physical object, and that object was yours until it became damaged or outdated, and you threw it out. But software, being much smaller in physical space than any commercial object that came before it, wasn't limited by these rules. Data came in bits and bytes, and could be dripped out or distributed in any number of ways. And that data was getting smaller by the day. Floppy disks begat smaller floppy disks, which begat hard drives, which begat CD-ROMs. If you had a modem, you didn't even need another disk! It makes sense that shareware came out of this floppy-copyin' state of affairs, because we needed a business model that encouraged copying. Today's Tedium talks about the impact of shareware, the first of many new business models given to us by the software industry. — Ernie @ Tedium

"You're probably used to buying an expensive, commercially marketed program, taking it home and hoping it does the job. All too often, though, you find it falls short of your expectations. The shareware way lets you choose from a wide variety of high quality programs and try them all until you find the one you like best. Then and only then do you pay a low registration fee to the program author."

— An explanation of the value of shareware in the introductory catalog for The Software Labs, a shareware-by-mail distributor. The fairly slick 1992 catalog, which can be viewed in all its wonderful glory on the Internet Archive (warning: large PDF file, but worth it; for those who want something smaller, I uploaded a trimmed-down version here), features a massive number of games, education apps, and graphics programs. On the other hand, it has just seven business apps—which should give you an idea of what kinds of developers were making shareware. Such catalogs were common in the early '90s, during a period when people who didn't have modems had to acquire software in meatspace.

The Original In-App Purchase

(ZZT screenshot)

Five notable pieces of shareware that redefined computing

  1. PKZIP: This compression utility, created by developer Phil Katz in the late '80s, compressed files so efficiently that it became a de facto standard that's used to this day. Katz, who created this app after a legal battle over the similar .arc format, smartly made the .zip format open, only charging for his implementation of the format. (A quick sketch of just how open the format remains appears after this list.) He had a hit on his hands. But Katz's life, as we highlighted last year, wasn't so easy. He died in 2000.
  2. McAfee Antivirus: Before John McAfee became the subject of, shall we say, interesting headlines, he innovated in the antivirus market by becoming the first person to offer such tools under a shareware license. The result was so effective that the software quickly caught up to its largest competitor, Symantec's Norton Antivirus—especially after he helped create a media frenzy around the Michelangelo virus.
  3. Wolfenstein 3D: Clearly, id Software's Doom and Quake (and Duke Nukem 3D, released by Apogee/3D Realms) followed in its footsteps, but the drip-out strategy of this famous game—which innovated at a level that topped many shrink-wrapped games of the era—proved to skeptical gamers that shareware was definitely not a second-class product. Doom is arguably seen as a more influential piece of software, but Wolf3D proved out the model.
  4. ZZT: The first game released by Epic Megagames is in some ways pretty much the opposite of what the firm, now known as Epic Games, is known for. (That is, the myriad versions of the Unreal Engine.) But in other ways, it makes a lot of sense. The game, which is completely built using ANSI graphics, stood out among game fans because of its built-in editing tools—which in many ways defined the company's future approach. Like Wolfenstein 3D, the success of this game set the stage for one of the defining game companies of the 21st century.
  5. Trumpet Winsock: For years, Microsoft had a blind spot toward the internet, but shareware filled the gap on Windows 3.0 and 3.1. In particular, this shareware implementation of the Windows Sockets API became a must-have for many users who wanted to try out early graphical web browsers like Mosaic and Netscape. Microsoft eventually figured out it needed a version of this and indirectly made a whole lot of money from it. Alas, Trumpet Winsock creator Peter Tattam received very little in profits from his widely distributed software.
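One measure of how completely Katz's open format won out: nearly every modern programming language ships with a .zip implementation in its standard library. Here's a minimal sketch using Python's zipfile module; the file names are just examples.

    import zipfile

    # Write a small archive using the deflate method PKZIP popularized.
    with zipfile.ZipFile("demo.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("README.TXT", "A demo archive, not actual shareware.")

    # Read it back; any .zip-aware tool since PKZIP can do the same.
    with zipfile.ZipFile("demo.zip") as zf:
        print(zf.namelist())
        print(zf.read("README.TXT").decode())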

The Original In-App Purchase

(Photo.Ski/Flickr)

Three guys, three apps, one great idea: The story of shareware's creation

As a concept, shareware can be credited to three separate people, all of whom offered up different elements of the software offering that eventually became shareware.

In 1982, a computer developer and magazine editor named Andrew Fluegelman came out with a piece of communications software called PC-Talk, an early app that he offered up under a model he called freeware—a term he trademarked, though it has little in common with the free-for-the-taking modern definition of freeware. The software could be shared freely, but he requested that, if you liked it, you mail him $25—something he claimed was more economics than altruism. PC-Talk earned a rave review in PC Magazine, Fluegelman's employer at the time, on its own merits.

"An ingenious communications program composed by PC Associate Editor Andrew Fluegelman that is elegantly written, executes without quirk or mishap, and is free for the asking," columnist Larry Magid wrote.

Around the same time, an IBM employee named Jim Knopf (also known as Jim Button) developed a database app, sold under a similar model, called Easy-File. Soon, Knopf and Fluegelman met one another in person, and apparently out of solidarity over their shared model, Knopf renamed his database app PC-File and got the ball rolling on his company, Buttonware.

In an essay on the creation of shareware, Knopf noted that the innovative sales strategy led to natural attention for his firm.

"Here was a radical new marketing idea, and the computer magazines were hungry for such things to write about," he explained. "The result: much free publicity for PC-File."

The next year, a Microsoft employee named Bob Wallace left that company and created a new one, QuickSoft, whose first app was PC-Write. Wallace, who helped other early software-makers with marketing, was ultimately the one who came up with the memorable name for the marketing strategy: shareware. (It was certainly better than the alternative Knopf came up with, "user-supported software.")

The strategy quickly caught on, and other indie developers followed suit. Computer clubs around the country, wanting to offer their members new software, took advantage of the perks of shareware, which drove its growth.

As a result of all this growth, shareware eventually got organized. In April 1987, the Association of Shareware Professionals formed with the goal of ensuring that the distribution of such software was safe and that the rights of software makers were protected. The organization verified sources that distributed such software and created standards that independent software sellers could follow so they wouldn't get burned. If you ever run into a file called FILE_ID.DIZ in a zip file somewhere, you can credit ASP for putting it there—it's a text file describing what the shareware does, as shown in the example below.
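(The format was charmingly spartan: by ASP convention, a plain ASCII file of up to 10 lines, each no more than 45 characters wide, so BBS software could display it as-is. Here's a made-up example; the program and company are entirely fictional.)

    MEGAPUZZLE v2.1 <ASP> Action-puzzle game
    for DOS. 256-color VGA, Sound Blaster
    support, 30 levels. Register for $15 to
    unlock 30 more levels and a level editor.
    From the fictional Hypothetical Software.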

Shareware, clearly, was an idea with legs—and those legs, ultimately, outran the three men who invented it. Wallace died in 2002, Knopf in 2013, and Fluegelman, most tragically of all, in 1985. The legacy of their work, in many ways, lives on.

"I have never been as socially involved, as interconnected with as many different kinds of people, as when I started getting involved with computers."

— A quote from a recorded interview with Andrew Fluegelman, as cited by Infoworld contributor Kevin Strehlo in a memorial piece about the developer and magazine editor upon his 1985 disappearance. Fluegelman's legacy, as highlighted in this clip, looms large over early computing: Beyond being an inventor of shareware, he was the founding editor of PC World and Macworld. But his health proved troublesome. He had developed cancer and was suffering from colitis at the time of his disappearance. (He has not since been found and is presumed dead.) As a 1985 transmission on the Newsbytes news wire notes, Strehlo departed from the magazine as a result of this piece, because the magazine chose to cut the last few paragraphs of his column, which stated the circumstances of Fluegelman's disappearance and speculated that it was a suicide. (Also worth reading is an interview with Fluegelman for the magazine MicroTimes, republished by journalist Harry McCracken on Medium.)

The Original In-App Purchase

How adware bastardized shareware's good name

Shareware wasn't perfect as a business model—clearly, more people used software for free than paid the piper—but, despite Fluegelman's initial claims, it was fairly altruistic. It kept small developers in the game and let computer users try out different kinds of software without spending thousands of bucks at Best Buy.

The problem is, all the altruism that went into the model at its launch didn't follow through to a second generation of developers.

During the early Windows XP years, the model broke down. A few factors were behind this, including the tendency for shareware-distributing websites to be the exact kinds of sites that would distribute downloads with adware—a problem that lingers to this very day.

Sometimes, this adware would take the form of programs like Gator, which claimed to have legitimate uses (it would save passwords for you), but also came with a pop-up ad network. (The company behind Gator denied being adware and changed its name multiple times before its inevitable shutdown.)

But more often, the adware just tagged along like an unwelcome guest, ready to clog up your computer with crap at a moment's notice. Almost as if to reflect this shift in shareware's reputation, the Association of Shareware Professionals changed its name to the Association of Software Professionals in 2010.

"It is no longer necessary to distribute software by sharing floppy disks or to pay with a check in the mail," the association explained in a blog post. "Many consumers, fairly or not, have come to see 'shareware' as short-hand for 'amateurish.'"

Sure, shareware still had its areas of success—Mac-based software developers like Panic and Rogue Amoeba did pretty well building up audiences during this era with software releases that were effectively shareware.

But overall, downloading software became way too much of a minefield for shareware to thrive. It's unfair, but that's what happened. The decline of shareware created an opening for other types of software distribution approaches to thrive, including open-source software, freemium software, and software as a service (SaaS).

And it paved the way for App Store-like approaches, where a gatekeeper plays goalie between you and your digital device, keeping out the extra crap you don't need.

The system isn't perfect, and it's not shareware. But we most certainly wouldn't have gotten to what we have now without the cottage industry that a few creative developers built in the '80s and '90s.

In 1992, the same year that The Software Labs published its immaculate catalog, the Software Publishers Association released a video that was the antithesis of the free-for-all of the world of shareware. "Don't Copy That Floppy" used rap to awkwardly sell kids on the idea that you shouldn't share software.

The video, forgotten in its time, became an eye-rolling classic during an era when millennials wore DARE shirts ironically and couldn't be bothered to tell you what Captain Planet was all about.

Shareware, which had Wolfenstein 3D, Doom, and Commander Keen in its corner, looked a million miles cooler, and it didn't even have to resort to cheesy raps to sell itself.

These days, more than ever, we embrace the underdog software developer. Sites like Product Hunt exist basically to fête the developer who's willing to take a risk on the unknown. It can be challenging to make a living on a big risk, but it's possible.

By throwing out the shrink-wrap and the boxes, shareware created a path for the underdog to thrive. The culture we have around software these days exists because, way back when, we decided to copy that floppy.

And thank God we did, because our software would suck without the underdog.

War, Peace, And Action Figures


Today in Tedium: The '80s were by and large a time of peace for the United States. The Cold War was thawing, and most of the conflicts the U.S. did get into took place over secured phone lines, rather than on the battlefield. Pop culture was far more likely to arm itself to the teeth during this era: Military-inspired cartoons like G.I. Joe: A Real American Hero ruled the after-school airwaves, while films like Rambo and stars like Chuck Norris and Dolph Lundgren were bringing explosive military action to the cineplex. Not everyone was so happy about this state of affairs, however, and as I highlighted in this Worlds of Wonder piece back in September, toys were often the main targets of this disdain. Today's Tedium discusses the war on war toys. — Ernie @ Tedium

350%

The increase in sales of war toys between 1982 and 1985, according to statistics from the National Coalition on Television Violence. The organization estimated that war toys represented $842 million in toy sales per year, and that four of the five most popular kinds of toys released during the era portrayed some form of violence.

War, Peace, And Action Figures

Why such a dramatic increase in war toy sales? Credit G.I. Joe.

An important pop-culture trend redefined both cartoons and toys starting in the spring of 1982, and that phenomenon was G.I. Joe.

It was great for the toymaker Hasbro, but not so great for anti-war activists.

For those who haven't followed the history of G.I. Joe, here's a two-second primer: In the 1960s, G.I. Joe looked a heck of a lot different from the Real American Hero that is popular today.

The toys of the early G.I. Joe era, in the mid-1960s, were 12 inches tall, and could be realistically posed any which way. But the Vietnam War happened, and suddenly, toys inspired by soldiers seemed like not such a great idea.

A 1970s version, which de-emphasized the war and military concepts, failed to catch fire. But eventually, a new approach, tied to a storyline straight out of the comic books, did the trick.

Working with Marvel Comics, Hasbro came up with an original storyline for the toy line, complete with new characters and a common enemy—one you might know as COBRA. It played more with the military elements, but it wasn't tied to actual U.S. battles. And instead of selling giant dolls, Hasbro could now sell toys that were just 3.75 inches high, thanks to the success of Star Wars. That meant the company could save money and focus on building accessories for this massive universe Marvel helped them create.

The partnership, at first, also allowed for a clever skirting of advertising rules. See, animation couldn't be used to fully promote toys on television at the time—ads could be only partially animated, and they required shots of actual kids playing with the toys. As Mental Floss notes, Hasbro and Marvel got around that rule by promoting the comic book on television, a comic book that just happened to feature the toys Hasbro was selling.

It was the first ad for a comic book ever aired on television, and it set the stage for a massive cartoon success, one that came about in 1985, after the Reagan administration deregulated the kinds of cartoons that could air on television.

G.I. Joe, with its military themes and tightly integrated marketing strategy, won big. But it created a massive backlash of concerned parents and anti-war activists.

"I believe that for far too long the government, with its programming content guidelines, has indicated what is best for the public. I'd rather eliminate those guidelines and let the marketplace decide more surely what the people want."

— Mark S. Fowler, the head of the Federal Communications Commission during much of the Reagan administration, speaking to The New York Times in 1981. Fowler's let-the-market-decide approach to children's programming had significant effects on the kinds of programs that appeared during this era, which greatly benefited G.I. Joe and other toy-oriented shows, but put a damper on the kind of educational programming that defined the '70s.

War, Peace, And Action Figures

(JWPhotowerks/Flickr)

How the "Ban War Toys" movement gained momentum

By 1985, G.I. Joe and other action-heavy toys were massively popular, but that success was raising some larger concerns for parents, educators, and peaceniks.

And during that holiday season, the movement against war toys started picking up steam in the U.S., through a variety of advocacy groups and other organizations. One such group, the International War Toys Boycott, held mock funerals for military toys, complete with eulogizers such as Vietnam vet Max Inglett, who became an anti-war protester after returning home, going so far as to hitchhike across the country in a wheelchair to draw attention to his crusade.

"Since childhood, we have been conditioned by being told it is fun to play war," he told the Los Angeles Times. "I had numerous conversations in Vietnam about the fact that we are conditioned by war toys to think it's OK to kill in battle. I think we need to learn at a very early age that war is not a game."

Author and peace activist Deb Ellis, in an article for Peace magazine, suggested that the Iran hostage crisis created the environment that allowed war-friendly toys to thrive once again.

"Something as ugly as war needs to be beautified before it hits the market. Memories of American boys being slaughtered on national TV fade into the background when Rambo is around," she wrote. "America is seen as taking control of violence again, rather than as being a victim of it. Being kicked in the head is not glamorous, but doing the kicking is glamorous—and even virtuous."

Even celebrities got in on the movement. Michelle Phillips, the singer for the Mamas and the Papas, was inspired to join the Ban War Toys movement after seeing that a friend's 6-year-old son had been given toy machine guns for his birthday.

Perhaps the most prominent voice throughout this whole movement might have been a psychiatrist named Thomas Radecki. Having chaired the National Coalition on Television Violence throughout the '80s, Radecki was often the key advocate against violent imagery in the media and a commonly quoted figure. His views could be a bit out there at times—specifically, when criticizing Dungeons & Dragons, a game he claimed was behind a number of deaths. (And just watch him tell Larry King, straight-faced, in this 1989 clip, that Archie Comics are just as good an option as anything in the Marvel Comics stable.)

But he nonetheless gave weight to advocates for banning war toys, at one point finding, through his own research, that playing with He-Man toys often created more antisocial behavior among preschoolers than playing with Cabbage Patch dolls. The study was criticized due to its small sample size, but Radecki's voice gave it weight.

"The evidence is quite strong that we are transmitting an unhealthy message encouraging children to have fun pretending to murder each other," Radecki said, according to the New York Times.

(These days, Radecki has fallen pretty hard from grace: He is currently serving a prison term after repeatedly over-prescribing opioids to patients, sometimes in exchange for sex.)

The toy industry, understandably, was skeptical of the criticism it was facing from some of these critics.

"Imaginary play in no way makes ax murderers," stated Donna Datre, a representative of the Toy Manufacturers of America, in comments to Mother Jones in 1986.

On that front, point to Datre. Lots of He-Man and G.I. Joe toys were sold throughout the '80s. Few of them created any ax murderers.

1980

The year that Sweden implemented measures that discouraged toy producers and retailers in the country from selling war toys. The effort came about in the late '70s, the result of a few years of debate within the country. In the midst of the growing discussion of the issue internationally in 1986, the academic journal Prospects - Quarterly Review of Education published a report by a Swedish National Board of Education official that highlighted the progress the nation had made with its restrictions. Other nearby countries, such as Finland, eventually followed suit.

What's fascinating to me about the war on war toys that dotted the Reagan era is the fact that it appeared to come from a place of liberal conscience, which is unlike most movements of its type.

Clearly, however, it ran out of gas in the U.S.: G.I. Joe is still on the market, but you don't really hear about protesters holding mock funerals for war toys. (Regulations did, however, kill cartoons on most of the networks on which G.I. Joe originally ran.)

More successful were efforts to do the same thing to video games in the early '90s. As Joe Lieberman will tell you, game companies eventually adopted a ratings system thanks to pressure from the former senator on the issue.

Every once in a while, someone still makes the argument that war toys need to be thrown out. One such example comes from Tom Turnipseed, who wrote in 2009 that Inez Tenenbaum, who served as head of the Consumer Product Safety Commission for much of Barack Obama's presidency, should make it a priority to ban war toys, for safety reasons.

"Inez's top priority should be banning war toys," Turnipseed wrote on Common Dreams. "War toys are products threatening the safety of people everywhere with or without lead paint."

Such a ban, despite concerns about our country's mass shootings, seems unlikely. But if we ever did decide to do so, we'd have some interesting company on that front. Last year, Afghanistan—a country at the center of American wartime efforts for more than a decade—banned toy guns, with the goal of stemming violence.

According to AFP, Interior Minister Noor-ul Haq Uloomi cited "physical and psychological damage" as a reason for such a ban.

John Rambo might not know how to compute that news.


News In Small Bytes


Today in Tedium: Publishing words on a computer is not a difficult task these days. We have so many options to write words and distribute them through a computerized interface that we take the whole process for granted. (You probably published something in the time it took you to read the previous two sentences.) But there was a time when publishing through a computer was painful, cumbersome, and expensive. It also had little in common with the free-for-all that allows for the creation of all sorts of terrible/fascinating content online by random folks. Today, Tedium tracks all the scraps of journalism and tells a story about the earliest days of online news, when the modems used acoustic couplers and it cost a bajillion dollars a minute to read a news story on a computer. — Ernie @ Tedium

"We believe that this project will lead to enhancements in the content of the magazine. It will keep us in closer touch with the interests of our readers and will help us learn the best ways to use electronic communications in conjunction with our traditional print publication. We'll let you know about new developments in this area as soon as we can."

— A message in BYTE Magazine's October 1984 issue, announcing the launch of BYTEnet, which eventually became the BYTE Information eXchange (BIX). The network may have been the most ambitious online effort by a media outlet in the days before AOL and Prodigy made it easy. The magazine's writers, including "Computing at Chaos Manor" columnist Jerry Pournelle, visited BIX often, sometimes contributing content unique to the network. (Side note: Jerry Pournelle, who is also a famous science fiction author, is pretty much the most awesome human being who ever lived. He's 83 years old and he's been blogging pretty much since they invented wire.)

News In Small Bytes

An example of a Ceefax system. (Sarah Joy/Flickr)

Five notable developments in the history of publishing stuff on computers

  1. In 1969, The New York Times released its first online article database, which it called the New York Times Information Bank, or InfoBank. The tool offered access to stories from both the complete Times collection and selected archive materials from 60 other publications. "The Information Bank has been developed with the end user specifically in mind; every effort has been made to bridge the gap between the world of automated information systems and the student, business executive, government official or other information seeker," wrote Sally Bachelder, the marketing representative of the database product, in an academic journal article. If you ever wondered why the Times' online database was so much better than every other newspaper's, this is why.
  2. In 1970, a computer terminal in Columbia, South Carolina, sent an Associated Press story to another terminal in Atlanta. This, according to Poynter, was the first time a computer terminal was used to write, edit, and publish a story in full.
  3. In 1974, the British Broadcasting Corporation started Ceefax, the first teletext-based information service to go into wide use. The service, which was shown on television during the dead of night and accessible on British TV sets, proved popular enough that competing television networks created their own versions of Ceefax. It was in constant operation for nearly 40 years, until the shutoff of the BBC's analog television signal in 2012. Here was the final broadcast.
  4. Also in 1974, Dow Jones launched an online information service, built around its Wall Street Journal, that was designed specifically for investors, rather than researchers. The service, accordingly, didn't have a lot of archival material, and instead was designed to offer up-to-the-minute financial news—news that cost a pretty penny and was sold on a contract basis to investors.
  5. In 1993, AOL released its RAINMAN markup language for building screens and content in its proprietary interface. This tool, created by two of AOL's cofounders, was effectively one of the first content management systems to go into wide use—in other words, the WordPress of the '90s. Not that a lot of people knew about it way back when—heavily guarded by non-disclosure agreements, it only really became publicly known after a community of AOL hackers came about.

"We do think that it provides us with an excellent means to supplement the daily newspaper, by providing readers with updated information, after they receive their paper, throughout the day, so that any time during the day that a reader wants the latest story, the latest bit of information on a particular item, all they have to do is dial into the CompuServe service."

— Bob Johnson, vice president of the Columbus Dispatch, discussing (in a 1980 interview) why he felt the online version of the Dispatch, which was being tested on CompuServe that year, wouldn't compete with the print product. The Dispatch was the first newspaper to go online—partly a hometown-pride play on the part of the Columbus, Ohio-based CompuServe—and it was far from the last.

News In Small Bytes

How CompuServe convinced the country's largest newspapers to put their stories on its service

The Columbus Dispatch's early experiments with CompuServe were no accident.

They came about thanks in part to some savvy discussions on the part of CompuServe's cofounder and early CEO, Jeff Wilkins.

The firm, which got its start in 1969 as a computer time-sharing firm targeted at large companies, slowly evolved into a consumer-targeted service that offered different kinds of online software to end users. One of those software ideas was searchable news, something that Wilkins championed as a larger concept.

In an interview for the podcast Conquering Columbus, Wilkins noted that the firm wanted to get the Associated Press, which had previously been available to the public only through dead trees, onto its service. Wilkins first approached the Dispatch, talking the paper into giving the online service a test feed to try things out.

Eventually, Wilkins reached out to the Associated Press directly, traveling to New York to show the concept to AP officials, who then brought the idea in front of the board of the American Newspaper Publishers Association (ANPA). The board, which got a gander at the concept at a meeting in Hawaii, proved so intrigued by what it was shown that members, including Washington Post publisher Katharine Graham and New York Times publisher Arthur Sulzberger, decided to make a special trip to Columbus to meet with CompuServe in person.

Wilkins, who had initially wanted to pay for a direct AP feed, managed to flip the script: He asked the association to let him have access to the feeds of 10 large papers for a test, in exchange for $250,000 in free advertising from each paper. ANPA not only agreed to that deal, but said it would be willing to get each of its member papers on board as part of an ongoing test of the service—one that, in the end, would last approximately two years.

"So, we ended up with every big newspaper in the United States," he said in the podcast.

It may have been the greatest negotiation in business history. Put in a room with the most powerful publishers in the newspaper industry, CompuServe's executives managed to convince the AP's members to give away their copy for free—all by portraying it as an experiment.

"Oh, it was beyond our wildest dreams, all I could do to stop from keep from jumping up and down," he added.

Alas, the idea proved more successful in theory than in execution. According to the 2008 book On the Way to the Web: The Secret History of the Internet and Its Founders, the experiment ended with something of a thud, with just 10 percent of CompuServe's users regularly using the platform and outside beta testers complaining about the lack of photos. (And those users weren't even paying $5 an hour!)

Additionally, questions quickly arose about whether services like CompuServe would endanger the golden goose:

The test brought out quite a bit of data about online demographics and reading habits. Most users were males in their thirties, with decent incomes. They logged on for email, news, shopping, gaming, banking, and so on. But newspaper publishers weren't as interested in these things as they were in assurances that online services were no threat to newspapers and their advertising base. They got that assurance from AP spokesman Larry Blasko, who told them, "There is no danger to the American newspaper industry from electronic delivery of information to the home."

Newspapers had survived radio. They had survived television. According to ANPA, they would survive computer networks.

With the benefit of hindsight, underscored by the recent decline in print advertising revenue at the New York Times, that may have been a bad bet.

$50M

The amount that the newspaper company Knight-Ridder spent on Viewtron over its three-year lifespan. The initiative, an effort to bring a Ceefax-style videotex system to the United States, flopped pretty hard, garnering just 20,000 subscribers at a time when other online networks were seeing massive growth, and it never went national. Its model was a clear inspiration for AOL and Prodigy, however. Oh, and WebTV.

As we all know from the modern internet, the news industry's relationship with the series of tubes is mutually beneficial: The news wires pump plenty of stories onto the internet, and the internet, in turn, is a great source of news fodder.

And back in 1983, CompuServe gave back to the news wires in a big way. One of the service's earliest prominent users, Christopher Dunn (or CHRISDOS), had met the love of his life, Pamela Jensen (or Zebra3), through the service's chat lines, known at the time as CB Simulator.

Eventually, their story got picked up by the Chicago Tribune, creating a bit of a media sensation. CHRISDOS and Zebra3 introduced the public to the idea of online dating, and the couple ended up getting married, even showing up on Donahue at one point.

"If it weren't for the way we met, I think we could just be any other 25-years-married couple," Dunn told the Tribune in a 2008 retrospective story. "I've always adored her. She adores me. It's very easy to love my wife, I guess that's just the way I am."

It was just the first of many viral news stories that the online news machine would swallow up and turn into content. But it may be one of the most heartwarming.

Anything But Politics


Today in Tedium: If you’re in the U.S.—and perhaps, even if you’re not—every other message you’re getting right now is probably about this dang 2016 presidential election that’s coming to a close as we speak. Much fighting has been had. Many stories have been written. As a newsletter, Tedium exists to talk about dull, unusual things that basically aren’t, well … the 2016 election. Your brain most certainly needs a break from the electoral pummeling it’s going to be taking in the next few hours as Wolf Blitzer and Chris Matthews do the opposite of soothing your frayed nerves. This issue is being published a little early today, to offer just that respite. Consider it therapy by Tedium. — Ernie @ Tedium

“The harmonic intervals—or gaps between notes—have been chosen to create a feeling of euphoria and comfort. And there is no repeating melody, which allows your brain to completely switch off because you are no longer trying to predict what is coming next.”

— Lyz Cooper, the founder of the British Academy of Sound Therapy, discussing the process the band Marconi Union used to create what’s referred to as the most relaxing song ever, “Weightless.” The song is eight minutes long, and was created with the help of scientific theory. So if your brain is on the verge of explosion, listen to this. Or take ibuprofen. Your pick.

Anything But Politics

Doug Funnie, looking relaxed, writing in his journal. This is how Tedium is produced twice a week. (Nickelodeon)

20 true facts that don’t have anything to do with each other besides the fact they’re not about American politics

  1. Back in the ‘90s, one company had the bright idea of combining a Super NES with an exercise bike—a narrow platform for which just a few games were made. The resulting device is incredibly rare, as are its games. When one of the bikes showed up on eBay in 2013, the bid price was $10,000.
  2. It took roughly 50 years after the invention of the can for someone to come up with a can opener. Before that, the recommended opening method involved a chisel and hammer.
  3. In 1984, Paul McCartney scored a sizable U.K. hit with a song he wrote for the cartoon Rupert. The cartoon aired before the movie Give My Regards to Broad Street, which was a movie he released that year that he really shouldn’t have. (Also of note: The film inspired a Commodore 64 game of the same name.)
  4. In Iceland, Cool Ranch Doritos are called “Cool American” Doritos, because nobody knows what ranch dressing is outside of the United States.
  5. Unlike Apple, Rolex is a 100 percent vertically integrated company, which means that every part of a Rolex watch is produced inside of a Rolex factory. No outsourcing in Switzerland.
  6. For five and a half years, workers in Czechoslovakia, then heavily influenced by the Soviet Union, spent tons of energy building a gigantic granite statue of Joseph Stalin in Prague. Soon after the statue was completed, however, the Soviet government decided to de-emphasize Stalin’s legacy. As a result, the statue was destroyed just seven years after it was unveiled.
  7. The company that created the Trapper Keeper is the same one that founded LexisNexis.
  8. In 1979, you could buy a backyard satellite dish from a Neiman Marcus catalog for $36,000.
  9. India Pale Ale gets its name not from being based in India, but the fact that it was designed to make the long trip from England to India, a British colony in the 19th century, without going bad.
  10. In 1983, the corporate rock band Journey starred in its own arcade game, which is notable because developers used an early digital camera to get the faces of the band members in the awful, awful game.
  11. A law recently passed in California allows actors to have their ages removed from IMDb pages, as a way to prevent age discrimination by casting directors.
  12. In the film version of The Wizard of Oz, Dorothy and the other characters are shown being covered in snow during one scene. That snow was made of asbestos, which is about as carcinogenic as a special effect can get.
  13. The first closed-captioned message, created by Texas Instruments in the ‘70s at the behest of PBS, was “Float like a butterfly, sting like a bee.”
  14. Spoiler alert: The creator of Doug failed to win over his real-life Patti Mayonnaise. No word on whether he satisfyingly decked the real-life Roger Klotz.
  15. Until earlier this year, Bill Gates owned millions of classic photographs. He sold those images—including the iconic Tiananmen Square “Tank Man” image—to a Chinese company. Most of these photos, by the way, are stored in a limestone mine in Pennsylvania. Here’s a documentary about the mine.
  16. It’s nearly impossible in a football game for a team to end the game with just four points. In fact, it was a big deal in November 2011 when the St. Louis Rams scored four points in a single quarter. (Nearly as unlikely is the 11-10 game, but it’s happened.) That said, despite the number of games played, there are a number of final scores that have never been reached in professional play. Pro Football Reference has the full list.
  17. According to an interview with Pras of the Fugees from earlier this year, the band’s first album initially sold just 12 copies. Their second album did a little bit better.
  18. Keanu Reeves had his signature forged on a contract by a friend of his—forcing him to make a terrible movie he never wanted to make, simply so he could avoid a lawsuit.
  19. The woman who invented the dishwasher is the granddaughter of the man who invented the steamboat.
  20. And finally … Steve Jobs introduced the world to Wi-Fi with a hula hoop.

Finally, just to offer you some thoughts that aren’t related to politics, or anything in particular: Some people might wonder why this newsletter hits your inbox really late at night—sometimes around 3 a.m. Eastern time.

I don’t think there’s any rhyme or reason to it, other than that, perhaps, I’ve always been a night owl. Sometimes, being up really late with nothing else to distract you is really great for focus and attention.

Sometimes, I’ll nod off while I’m writing, maybe for 10 or 15 minutes, maybe a little longer, but for some reason I always wake back up and finish the piece, committed to writing about how they brine pickles or whatever other inane topic I’ve chosen to cover that night.

I like to think that someone might read this newsletter in the middle of the night, unable to zonk themselves out, and my rant about whatever weird topic is on my mind that night does the trick.

When it comes down to it, the stresses of life can feel almost crushing sometimes. But writing Tedium, no matter how long it takes me, is a great way for me to stop worrying about how stressful the world or the rest of the news cycle gets.

Tedium, in its own way, is how I keep my own mental health in check. I hope it comes in handy for yours tonight.

Junk Food’s Happiest Accident


Today in Tedium: They change the color of our skin. They get stuck in our teeth. But for some reason, we can’t stop eating cheese curls, the puffiest snack food ever created. But these corn-and-powder snacks didn’t just fall like manna from the sky into our bowls, always there for us ahead of our BoJack Horseman marathon. The story of the cheese curl is one of the more unusual creation stories in snack-food history. Let’s talk about it. It’s weirder than you’d think. — Ernie @ Tedium

Junk Food’s Happiest Accident

(via Etsy)

Who invented the cheese curl? One story involves a piece of agricultural equipment

Wisconsin, the agricultural hub that it is, has given us a lot of food innovations over the years. (Three words: fried cheese curds.)

But some of those innovations, like the process that gave us the modern cheese curl, were complete accidents.

The accident proved fruitful for the Flakall Corporation, a Beloit, Wisconsin, animal feed manufacturer whose owners later switched gears to producing snack foods, all thanks to the way the company cleaned its machines. Flakall’s approach to producing feed was to run the grain through a grinder, flaking out the corn to get as much usable material as possible and to ensure cows weren’t chewing any sharp kernels.

Junk Food’s Happiest Accident

A feed grinder. (patent filing)

“This flaking of the feed is of advantage because it avoids loss of a good percentage of material which otherwise is thrown off as dust, and gives a material which keeps better in storage by reason of the voids left between the flakes, such that there can be proper aeration, not to mention the important fact that flaked feed is more palatable and easily digested by the animal,” the firm stated in a 1932 patent filing.

The grinder did its job, but it wasn’t perfect, and periodically required cleaning to ensure it wouldn’t clog. One strategy that Flakall workers used was to put moistened corn into the grinder. During this process, however, something unusual happened: the moist corn ran directly into the heat of the machine, and when it exited the grinder, it didn’t flake out anymore—it puffed up, like popcorn, except without the annoying kernels.

By complete accident, Flakall had invented the world’s first corn snack extruder.

Edward Wilson, an observant Flakall employee, saw these puffs come out of the machine, and decided to take those puffs home, season them up, and turn them into an edible snack for humans—a snack he called Korn Kurls.

Another way to put this is that when you’re eating a cheese curl, you’re noshing on repurposed animal feed.

Junk Food’s Happiest Accident

How a cheese curl is made. (patent filing)

This state of affairs led to the second patent in Flakall’s history, a 1939 filing titled “Process for preparing food products.” A key line from the patent:

The device preferably is designed so as to be self-heated by friction between the particles of the material and between the particles and the surfaces of contacting metal and to progressively build up pressure during the heating period. Thus the uncooked raw material, having a predetermined moisture content is processed into a somewhat viscous liquid having a temperature high enough to cook the mass and heat the water particles to a temperature high enough for evaporation at atmospheric pressure but being under sufficient pressure to prevent it.

If that’s a little complicated to understand, a 2012 clip from BBC’s Food Factory does the trick.

In the video, host Stefan Gates connects an extruder to a tractor, spinning the machine so fast that it puffs out the corn in spectacular, dramatic fashion.

Clearly, Flakall had something big. The firm eventually changed its name to Adams Corporation, which helped take some attention off the fact that it was selling a food product to humans that was originally intended for cows.

Junk Food’s Happiest Accident

The other claimant to the cheese curl’s invention bounced back from natural disaster

While Flakall has the more interesting tale on this front, it’s not the only claimant. Another early contender for the cheese curl’s invention is a Louisiana firm called the Elmer Candy Corporation, which developed a product eventually called Chee Wees.

The Big Cheese of New Orleans, as it’s nicknamed, became a local institution. (Unlike Zapp’s, another local snack-food icon whose excellent Voodoo chips have a cult following, Chee Wees haven’t gone national. Perhaps they should.)

Elmer’s Fine Foods—no longer a candy company—is a family-owned business that’s produced cheese curls mostly continuously for roughly 80 years.

I say “mostly,” of course, because the firm had to deal with the impact of Hurricane Katrina. As the company explains on its website, Elmer’s entire facility was flooded out by the deadly storm, and the company had to halt operations for 16 months while it recovered from the hurricane and completely replaced the machines that produced the snacks.

A challenge like that might have been enough to kill a lot of companies. But Elmer’s bounced back—and it’s still active to this day.

(Another notable cheese curl firm, Old London Foods, came out with its variation, the Cheese Doodle, in the late 1950s.)

Junk Food’s Happiest Accident

(Mike Mozart/Flickr)

Five interesting facts about Cheetos, the brand that took cheese curls mainstream

  1. While Cheetos came along later than its competitors, first appearing in 1948, the brand quickly overtook the market, in part because it enjoyed national distribution thanks to the prior success of Fritos. That company’s founder, Elmer Doolin, worked out a deal with H.W. Lay and Company to sell Cheetos to the broader market. The snack quickly became a massive hit.
  2. Cheetos are by far the most popular brand of cheese curls in the United States: According to Statista, the Cheetos brand had an estimated $969.5 million in sales in 2016, with the next most popular brand, Frito-Lay’s more-upscale Chester’s, garnering just 7 percent of Cheetos’ total sales.
  3. The success of Cheetos was so impressive that it played a large role in the merger of Frito with Lay in 1961, as well as the combined company’s merger with PepsiCo just four years later.
  4. There are two main varieties of Cheetos—crunchy, the most common kind, and puffed, which only came about in 1971 or so. Each is made through different variations on the corn snack extruder process. Dozens of other flavors exist, however, both inside and outside of the U.S.
  5. The reason that Flamin’ Hot Cheetos have such a prominent color that sticks to everything (and turns your fingers red), according to Wired, has a lot to do with the product’s use of food dyes that have an added chemical to make the seasoning oil-dispersible. That’s because the powder won’t stick to the Cheetos without vegetable oils.

Junk Food’s Happiest Accident

(Mike Mozart/Flickr)

Why we never got a Chester Cheetah Saturday morning cartoon, despite multiple attempts

These days, Chester Cheetah is trying to goad Beyoncé on Twitter just like every other advertising mascot worth its weight in salt, but there was a time that the cheetah was seen as so impressive that there was chatter it could become a cartoon lynchpin.

In fact, Frito-Lay got pretty far down the road with Fox in turning the mascot, launched in 1986, into a cartoon. Yo! It's the Chester Cheetah Show, as the toon would have been called, was developed as a potential part of Fox’s Saturday morning cartoon slate. (CBS also considered making the show, but rejected it.)

Problem was, advocacy groups were not happy with the idea for the show, because its roots so clearly lay in advertising. Action for Children's Television (ACT) and the Center for Science in the Public Interest (CSPI) were among the groups that petitioned the FCC regarding the program.

"His only previous television appearances, indeed his entire existence, have been in traditional commercial spots designed to sell a product,” the FCC petition stated, according to the New York Times.

ACT noted that it was rare to petition the FCC about a cartoon only in the planning stages, but felt it had to speak up due to what it felt was the unprecedented nature of the idea.

It didn’t help that Kraft was trying to sell Cheesasaurus Rex, its macaroni and cheese mascot, as a TV show around the same time.

Just a few weeks after the controversy blew up, Frito-Lay and Fox shelved the idea, with Fox claiming that differences in creative control and long negotiations killed the show—not protests.

“I still believe he’s one of the best characters since Bugs Bunny, and the fact he is associated with a product was irrelevant to us,” Fox Kids President Margaret Loesch told the Times.

Frito-Lay spokesman Tod MacKenzie, in comments to the Associated Press, was a bit more honest.

“Since Chester came out in 1986, he’s been wildly popular,” MacKenzie told the AP. “We don’t want to jeopardize the job he’s doing here.”

The book Saturday Morning Censors: Television Regulation Before the V-chip, published in 1998 by Duke University Press, pointed out that while ACT didn’t officially win the case with the FCC—which, during the Bush era, was in no mood to censor a show that wasn’t indecent—the group acted like it did, especially after its efforts also killed off the Cheesasaurus Rex show.

"We feel we have zapped, for the time being, the problem of logos turning into half-hour programs," ACT President Peggy Charren stated at the end of the controversy.

But you still have to wonder what might have happened if the show had gotten the green light after all this. (This wiki page certainly does.) Would we be eating fewer cheese curls? Maybe more?

Clearly, the cheese curl has come a long way from the days when it was a happy agricultural accident.

But one firm that didn’t see the success of the Cheeto, at least not directly, is Flakall. Its lineage lives on at a firm called Maddox Metal Works, which makes the machines that manufacture corn-based foods. Here’s the one for cheese curls (Flash required).

The company acquired Flakall’s successor company, Adams International, in 1993. It was an acquisition that makes a lot of sense. See, in the 1950s, Maddox Machine Shop worked directly with the Frito company to build the machinery used to produce the company’s snack foods, and it grew from there.

In Beloit, Wisconsin, there is a Frito-Lay factory—a big one, one that the company has invested millions of dollars into. One that makes extruded snacks.

I’d like to think that a Frito-Lay executive put the factory there because they wanted this story to come full circle.

Lessons From The Video Professor


Today in Tedium: Home computers and infomercials were always meant to go together. Here were these devices, arguably the most complex products ever to enter homes in Middle America, and they needed a lot of explaining at first. (Not complicated explanations, mind you. Simple ones.) Enter Video Professor. For two decades, John Scherer made millions by selling educational products to the public, mostly through late-night infomercials and on home-shopping networks like QVC. Video Professor the company isn’t with us anymore—these days, you’re more likely to get a DNS error from VideoProfessor.com than a lesson—but in today’s Tedium, I talk to John about the business that made him famous and what he’s doing these days. — Ernie @ Tedium

Editor's note: This piece originally published on Vice's Motherboard, a syndication partner of ours. It's a pretty rad site; they might be onto something.

$100M

The amount in revenue Video Professor made during its peak years—around 2005 and 2006—according to a Denver Post profile of the company. The Colorado-based firm, launched in 1987, sputtered at first, with Scherer admitting to the Post that he struggled to cover the bills during the first few years of its existence. Things changed in 1990, when he started doing infomercials—something that quickly drove the company’s success.

Lessons From The Video Professor

John Scherer's well-known catchphrase: "Try my product." (YouTube screenshot)

The story of how Video Professor became such a big deal

“I used to tell everyone if I just had a room full of people to show how the product works everybody would buy it. The problem was I could never figure out where that room full of people was located.”

John Scherer didn’t see himself becoming a television icon. TV was simply the most effective way to sell his product—and it turned out, the giant room full of people interested in Video Professor’s lessons was actually millions of smaller rooms, each with a television set or computer sitting nearby.

The concept was born from Scherer’s first business, a Colorado PC clone-maker called Data Link Research Services. The company was successful, but it had some sluggish periods. Amidst an effort to sell machines to an investment firm, he had the idea of giving the firm video instruction kits to explain how to use the computers as well as the investment software installed on the machines. In the process, he discovered something surprising.

“I told an employee to go to a library or store and get a video on how to use the computer so we could include that as well and we discovered that there was nothing like that available,“ he said. “The rest, as they say, is history.”

In a 1987 interview with InfoWorld, Scherer noted one of the advantages of the Video Professor tapes was the ability for users to study at their own pace. Targeted at novices, the company’s products were intended as an alternative to complicated manuals or large, classroom-style training sessions.

While the Colorado-based firm's products were built with the business market in mind—during a period when computer knowledge didn’t come naturally to users—they proved just as useful for home users.

Infomercials helped to drive the product to a larger user base, and at the same time, they made Scherer—who, as Jimmy Kimmel once hilariously pointed out, looks like John McCain with a mustache—a recognizable figure to millions of people.

“It didn’t matter if I was in New York City or California—people would stop me when I was out for dinner or just walking down the street, and when Jay Leno talked about me on his show, I knew I was becoming well known,” he said.

That came with its benefits—it boosted the company’s sales, and before an interview, Fox and Friends introduced him as one of the most recognizable people in the country—but it also put extra scrutiny on the firm.

It cut both ways.

Lessons From The Video Professor

Video Professor disc set (via Amazon)

Five interesting facts about Video Professor’s lessons

  1. Initially, the firm sold VHS tapes, but moved to CD-ROMs as that format became more mainstream. “When we changed from videos to CDs our revenue shot sky high,” Scherer said. The internet also juiced sales, but complicated the business model, as the firm sold different products online than it did on television.
  2. The lessons for different Video Professor courses were put together by actual professors from universities, who wrote the scripts used in the videos. “We knew if we could understand it, then we knew the consumer could,” he noted. The videos were then split up into three different levels.
  3. The company never really aimed for the bleeding edge, and the content production model proved it. Generally, Video Professor titles came out within six months of the release of a given app. Just once did the company make an exception to this rule, upon the release of a version of Windows, at the behest of QVC. (That didn’t work so well, Scherer says.)
  4. The company made most of its money off of just a few of the courses in its library—with many of its most popular lessons being for Microsoft programs—though it had a much larger library that included apps like Photoshop and Quicken.
  5. The most popular Video Professor lesson of all time was, surprisingly, for eBay. The success was such that the company made six-level courses for that particular video. Not that Scherer initially believed in the product. “Very honestly, I was against making the eBay lesson at first,” he said, “so against it that when I was shooting the commercial, I broke the disc that I was holding up because I was angry we were doing something like this. I guess that shows, what did I know?”

Lessons From The Video Professor

An illustration of the Video Professor, as shown in a Windows 3.1 instructional tape. Somewhat ironically, this particular video was created with an Amiga. (YouTube)

Not everyone was a fan of Video Professor’s business model

Video Professor and its many ads made Scherer well-known, but it also drew attention to the company’s business practices—and consumer advocates didn’t like what they saw.

The model, similar to the Columbia House payment structure with the additional wrinkle of automatic credit card payments, drew a number of complaints from consumers, in part because of the large amounts of money it charged for courses in its installment plan. Unlike an unwanted Columbia House CD, a single Video Professor tape or CD could cost hundreds of dollars.

In the last few years of the company’s life, for example, it became a frequent target of discussion at Consumer Affairs, which tested the company’s model after receiving consumer complaints. (The publication’s Joseph Enoch dealt with a bit of a rigmarole during the ordering process according to a 2007 feature, but found that the return process ultimately worked as advertised.)

Sometimes, the approach meant that the firm would find itself in battles with some of the biggest names in tech media. In a 2009 TechCrunch post, site founder Michael Arrington raised a number of concerns with the company’s business model, and drew attention to the fact that the firm had tried to push The Washington Post to modify another post that mentioned the company. (The Post declined.)

“Video Professor is unlike mobile scams which look to get a relatively small $10–$20/month subscription on your mobile bill and hope you never notice,” Arrington wrote. “They go for the big kill: $190-$290 charged to your credit card one time.”

Scherer was quick to defend the company’s practices, stating that the firm had an internal refund policy in which it agreed to refund a user’s money no matter what.

“Having that type of policy and then having people write a criticism that we would not refund them would upset me greatly as I knew they were being dishonest,” he explained to me. “Not only because Video Professor was my company but also because I was the face of Video Professor, so when Video Professor was attacked by critics it was an attack on me personally.“

Those attacks at times led the firm to sue some of its critics, most notably in 2007, when the firm filed lawsuits against 100 anonymous commenters who wrote negative reviews of the company. The issue drew the attention of Public Citizen, which defended some of the critics and the sites that hosted their complaints. Eventually, Video Professor backed down, though Scherer—who claims that less than one-half of one percent of customers ever complained—argues that some of the actual targets of the company’s legal strategy were competitors acting in bad faith.

“We had a system in place that proved they were telling untruths and often the criticism was coming from competitors bashing us for their own gain. We found some of those competitors, filed suit and won,” he said.

Scherer blamed the negative press during this era on the firm’s large size and public profile.

“I also learned that the bigger you get in business the bigger target you become,” he said.

“We tried changing the business model from $179, or whatever the price was at that time, to an online model that was $9.95 per month, and it didn’t work.”

— Scherer, discussing one of the contributing factors to the decline of Video Professor as a company and its eventual demise. “In switching over we had to kill the CD sales or it would be too confusing for people,” he added. “When we did that, we killed ourselves, since it was a very complicated switch that didn’t catch on with most people.” (The switch, it should be noted, came at a time when the company was already struggling.) The firm, he noted, also contended with a significant amount of affiliate link fraud, which helped dampen its success.

These days, the Video Professor has found a new calling, one with fewer pitfalls in its business model than his previous one.

In 2012, Scherer started selling a replacement for compressed air, a product he characterizes as dangerous to use. The idea, as MarketWatch notes, was brought to him by an inventor as a potential product he could pitch, but he chose to buy the company instead.

“The fact that there had been no real alternative to dangerous canned air piqued my interest,” he said. “So many people die or are killed as a result of misusing canned air that being able to offer such a great alternative to canned air was very pleasing to me.”

His new company, Canless Air, sells motorized devices that blow air that isn’t compressed, but comes out really fast—in the case of its highest-end product, as fast as 260 miles per hour. Compressed air is commonly associated with cleaning computer equipment, but Scherer insists (as the above clip shows) that the product has a wide variety of other uses.

Scherer doesn’t show up on TV nearly as often these days, as Canless Air is largely targeted at corporate users, but his recognizable face has remained a hugely successful marketing tool.

He has also at times played pitchman, running a website in which he offers up his services. But he says that he doesn’t see himself becoming the next Billy Mays—for one thing, he knew the late OxiClean pitchman, and Billy was one of a kind. Scherer, with his slower drawl, says he never thought he’d become a pitchman for hire.

“The only reason I ever started pitching my product was because I was free and I knew my product better than anyone,” he said.

If he could do it again, he says he would have done it about the same way—though he would have sold Video Professor at the height of its success. That’s despite the huge amount of work that went into the company over the years.

“You've got to eat, sleep, drink, live, and breathe your work all day long, even on the weekends, and realize that you're not going to be spending any time at the beach until you're successful.”

A Big Idea, Synthesized


Today in Tedium: In the late 1970s, a man who had changed the business world by turning massive calculators into handheld devices decided that he wanted to scratch another itch. And with that itch scratched, he introduced a world of creativity to bedroom warriors around the country—a set of training wheels to the musically inclined. The devices he created proudly ignored their weaknesses and succeeded despite themselves. And they even inspired pop culture in ways big and small. Today in Tedium, we’re talking about Casio and the tinny electronic music revolution it fostered. You don’t need a backing band; you have a keyboard. — Ernie @ Tedium

“The guitar enjoyed a very, very long run. But this is something new. Mobility has made these keyboards an entirely new product.”

— Robert Larsen, the national sales manager of Casio’s electronic musical instrument division, telling the New York Times that electronic synthesizers were where it’s at. He had good reason to be bullish. By the American Music Conference’s 1983 estimates, keyboards like Casio’s, which sold for around $200 apiece, represented $150 million in sales that year, compared to $93 million for all fretted instruments. Not bad, considering electronic keyboards weren’t even a $10 million business in 1980. Guitars are still with us, however.

A Big Idea, Synthesized

(Daniel Oines/Flickr)

How Casio turned its calculator conquests into synthesizer success

In 1979, a calculator came out that seemed to serve multiple masters. It was highly functional, but it served as something of a bridge between Casio’s earliest success stories and the devices that would define the company in the 1980s.

This device, the Casio Melody-80, could do math, clearly, as it was a calculator, but it was also able to work as a stopwatch or an alarm clock. But the real trick, the one that makes it such an impressive artifact of 1979, is that it was able to work as a musical instrument as well—a device that had pre-programmed classical music that could be easily played, as well as the ability to work as something of a synthesizer.

“As you must have figured out by now, the Melody-80 doesn’t confine its genius to strictly serious business,” a Sharper Image ad in an October 1979 edition of Popular Science proclaims. “By flicking the upper right-hand switch, you convert its calculator buttons into music keys, a full 11-note scale from A to D.”

People didn’t need Beethoven with their math—certainly not when it sounded like this, and certainly not from The Sharper Image—but the device was a turning point for Casio and the company’s primary designer, Toshio Kashio.

Kashio was the second-oldest of the Japanese brothers who founded Casio and the man who played a key role in inventing many of the company’s earliest products. In 1978, Toshio led the company into the electronic music market, with an array of devices that would come to set the stage for the music industry in big ways and small.

See, Toshio was an amateur musician, and in many ways, he saw the problem he was trying to solve as one that involved trying to make an entire world of instruments available to amateur musicians. So instead of building a device that played with waveforms like, say, Robert Moog did, Toshio approached electronic music as a way to create a single device that could play dozens of instruments—including a drum beat, guitars, and melodies. (This approach, as it turned out, was way more palatable to mainstream tastes.)

Toshio was an amateur musician, but could be called a professional patent-filer, with hundreds to his name. One such patent, filed in 1985 but started much earlier, describes the thought process behind the strategy Casio used for its earliest keyboards:

In order to obtain an artificial musical sound wave fairly analogous to its original natural musical sound, not only an analogous musical sound is used but also a volume envelope including factors such as wave rises and wave falls must be superposed on the analogous musical sound. However, there have been no proposals to effectively superpose the volume envelope on the sound wave by the digital technology. The conventional superposition of the volume envelope has been made by the analog technology or by using a complex control circuit. Thus, the musical sound wave formation technique by the digital technology, which is well adapted for [large scale integrated circuit] fabrication, has not yet been established in this field.
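Stripped of the patentese, that passage describes applying an amplitude envelope to a tone in the digital domain, sample by sample. Here’s a minimal sketch of the general idea in Python; to be clear, this is an illustration of the technique, not Casio’s actual method (which lived in dedicated chips), and the sample rate, timings, and function names are all invented for the example:

```python
import math

SAMPLE_RATE = 22050  # samples per second; an arbitrary choice for this sketch

def envelope(t, duration, attack=0.02, release=0.3):
    """A crude volume envelope: a quick wave rise, a long wave fall."""
    if t < attack:
        return t / attack                          # rise from silence
    if t > duration - release:
        return max(0.0, (duration - t) / release)  # fall back to silence
    return 1.0                                     # sustain at full volume

def note(freq, duration):
    """Superpose the volume envelope on a raw sine tone, sample by sample."""
    total = int(duration * SAMPLE_RATE)
    return [
        envelope(i / SAMPLE_RATE, duration)
        * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
        for i in range(total)
    ]

samples = note(440.0, 1.0)  # one second of A4 with a rise and a fall
```

That per-sample multiplication is the “superposition” the filing talks about; the patent’s larger point is that doing it digitally, rather than with analog circuitry, was well adapted to the large-scale integrated circuits Casio was in the business of fabricating.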

The technology that Casio came up with for some of its earliest keyboards became known as vowel-consonant synthesis, for the similarities the company’s approach had to human speech.

Beyond the Melody-80, Casio came out with a few more synthesizers that could be considered toys, like the Casio VL-Tone, famed for its central place in Trio’s “Da Da Da.” But in 1980, the company released its wood-encased Casiotone 201, the first musical instrument built around sound banks—a big step forward for the company, and for music in general.

It could play an electric guitar, a banjo, a harpsichord, a trumpet, and even a glockenspiel. Unfortunately for Casio, these sounds almost never lived up to their billing—coming across more like distant cousins of a given instrument than the actual tone.
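If “sound bank” sounds abstract: the idea is that the instrument stores a menu of preset timbres and the player just picks one, rather than sculpting waveforms by hand the way a Moog owner would. Here’s a toy sketch of such a lookup in Python; the harmonic “recipes” below are made up for illustration, not Casio’s actual data, and their crudeness is a decent hint at why these presets sounded the way they did:

```python
import math

SAMPLE_RATE = 22050

# A toy "sound bank": each preset is a recipe of harmonic weights.
# These recipes are invented for illustration only.
SOUND_BANK = {
    "trumpet":      [1.0, 0.7, 0.5, 0.3],  # brassy-ish: strong upper harmonics
    "glockenspiel": [1.0, 0.0, 0.3, 0.0],  # bell-ish: sparse harmonics
    "banjo":        [1.0, 0.5, 0.1, 0.4],
}

def play(preset, freq, duration):
    """Render one note using the selected preset's harmonic recipe."""
    weights = SOUND_BANK[preset]
    total = int(duration * SAMPLE_RATE)
    samples = []
    for i in range(total):
        t = i / SAMPLE_RATE
        s = sum(w * math.sin(2 * math.pi * freq * (h + 1) * t)
                for h, w in enumerate(weights))
        samples.append(s / sum(weights))  # normalize to roughly -1..1
    return samples

notes = play("trumpet", 440.0, 0.5)  # half a second of ersatz trumpet
```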

But that proved incredibly valuable for actual professional musicians, who saw the sounds for what they were—a new palette to paint with.

“The initial stage of this rhythm was like a buck-up. They brought a small Casio keyboard to me and started to play around, but it sounded crazy. It was too fast, no rhythm section, just drum and bass going at 100 miles per hour. So I said: 'I like the sound, but it's not the right tempo for reggae music.' I slowed it down to dancing mode, then we overdubbed some piano and percussion, and that was the beginning of Sleng Teng. I knew it was going to be successful because of the sound of the rhythm, but I didn't know that it would be so much of a big hit.”

— Lloyd "Prince Jammy" James, offering his take on the creation of Wayne Smith’s “Under Mi Sleng Teng,” a song that turned reggae into dancehall overnight, all with the help of the Casiotone MT40, which had a preset that proved well-suited to a complete reinvention of reggae music. ( Here’s a clip of Smith’s co-conspirator, Noel Davey, playing the basic melody of the Sleng Teng.) As Engadget reported last year, the Casiotone melody has found itself in a number of unusual places since then.

A Big Idea, Synthesized

A Casio Rapman. (YouTube screenshot)

As you can tell by the commercials, Casio’s keyboards aimed for Middle America

Casio’s decision to put its energy into sound-bank-style keyboards proved fruitful—not only did it make the company a lot of money, but it also gave Casio a great sales pitch. Here was a device that could let you be your own band.

(Definitely cooler than calculators.)

Once Casio realized it had a massive hit on its hands in the form of the various keyboards it was selling, it marketed them aggressively. One clip advertising the Casiotone MT-100 sells the idea of “The Johnson Five,” a one-man band that can rock it out with a single keyboard:

(Never mind the fact that at the end of the commercial, the clip says multi-track recording was used. Wouldn’t want to ruin the illusion.)

Nearly as good is this clip featuring the Casiotone MT-205 with its optional drum kit attachment.

Clearly, the intended audience for these things was the same audience that was waiting for that awesome A-Ha video to come on again.

Eventually, Casio moved away from the Casiotone series and tried a variety of different approaches. Casio saw the success Yamaha was having in the professional market and decided that it wanted in—initially finding success with the CZ series (likely thanks in no small part to this amazing commercial), but failing to maintain that momentum. Casio also started making keyboards for kids that were even simpler than the Casiotones. But later, the company found another hit in the form of the Rapman, which won plaudits from professional reviewers for being fun to use compared to other keyboards of the day, despite the disastrous commercials.

“Listen up: Casio's new Rapman does just about everything a bona fide rap group does,” The New York Times’ Dulcie Leimbach wrote back in 1992, clearly having never heard a bona fide rap group.

But these kinds of tools—toys, really—hold a sort of intrinsic value culturally, the kind that led Toshio Kashio to this idea in the first place.

A lot of creativity was had from that initial spark.

In mid-2007, I saw a YouTube video that floored me in its absurdity. On the scale of before-they-were-famous appearances, it’s miles above “active Vine user” and closer in its unexpectedness to The Party at Kitty and Stud's.

Now, Dan Deacon is certainly no Sylvester Stallone on the fame scale, but he was a bit of a fish out of water on the day that Savannah, Georgia’s WSAV decided to let him perform a live song, “Ohio,” on the air during the station’s morning show. There were probably a lot of wide-awake people in Savannah that morning.

Sitting on Deacon’s table, surrounded by wave generators and a pile of cables, is perhaps the most powerful Casiotone ever made. The MT-400V, with its numerous extras—an analog resonance filter, a noise generator, and stereo chorus effects among them—somehow managed to find itself on the table of a serious (if absurdist) musical composer.

“A nice Casiotone is always an endless mammoth of sound that can be fun,” he told Pitchfork in 2007, amid the release of his first commercial album. “Especially the MT-400V, which is the one that I've been using for the longest and that's my favorite piece of gear that I have.”

Deacon wasn’t alone in finding love for these devices, despite their clear deficiencies compared to better pieces of professional gear. Artists way bigger (or, more often, smaller) than Deacon have embraced the Casiotone in ways big and small—whether as an element of a larger composition, or (in the case of the lamented Casiotone for the Painfully Alone) pretty much the whole act.

See this viral clip of Jimmy Fallon, Metallica, and The Roots playing "Enter Sandman" on classroom instruments from just this week? They're keeping time with the help of a Casio VL-Tone.

Circuit benders have broken these devices in ways that push them far past their limits. They’re artifacts, yet they’re still vital.

In 2013, artist Daniel Arsham teamed with Pharrell Williams, who got his start on a Casiotone MT-500. Arsham took the shape of the keyboard and created replicas that were designed to look like they had suffered damage, like an artifact.

“I’ve described it as taking something from the recent past and projecting it over the current moment into the future,” Arsham told T Magazine. “In some ways you’re erasing the present.”

Finally, someone captured a visual representation of the sound of these things.
