This part cracks me up:
Once, I went for a jog and stopped at the post office to pick up a package they were holding for me. […] Because I didn’t have pockets, I had put [the AirPod] down on the counter.
So, Gruber’s stylish enough of a runner to be wearing shorts or tights that don’t have pockets, but he’s not stylish enough to care about running around with a package in his hand.
I would love to have seen this.
Maria Gallucci, writing at Mashable:
It sounds like an “alternative fact”: Sean Spicer, the White House press secretary, apparently swallows dozens of pieces of gum every day. […]
Contrary to what your mother may have told you, swallowing gum won’t lead to a massive pile of rubber globs building up in your belly for years. Gum will eventually wend its way into the toilet within a day or two. […]
The enzymes in your stomach can break down the carbohydrates, oils and alcohols used in each piece of gum. While gastric acids aren’t strong enough to break down the remaining rubber polymers, your digestive system can still work to push it out.
It’s good to know exactly what’s going on in the internal affairs of Sean Spicer.
From the Github blog:
To make it easier to find the pull requests that need your attention, you can now filter by review status from your repository pull request index.
It was clear that this would be coming as soon as Github introduced review requests. Great to see it happen. Prior to this, when someone requested a review, you got an email notification, but that was it. There wasn’t an easy way to see the requests on Github. As a result, they’d slip through the cracks, and developers would manually remind people that they needed a review.
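For reference, the same filtering is also exposed through GitHub's pull request search qualifiers; a sketch based on their search syntax (the username is a placeholder):

```
# In a repository's Pull requests tab, or in global search:
is:pr is:open review-requested:octocat    # PRs waiting on octocat's review
is:pr is:open review:approved             # PRs that have been approved
is:pr is:open review:changes_requested    # PRs with changes requested
```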
I practically live on Github, and it’s so great to see the product mature and improve.
From the Github blog:
Whether you’re debugging a regression or trying to understand how some code came to have its current shape, you’ll often want to see what a file looked like before a particular change. With improved blame view, you can easily see how any portion of your file has evolved over time without viewing the file’s full history.
A textbook example of where this comes in handy is when you want to look at the blame for a particular line or block of code, but the latest commit was an automated refactor. You want to see what happened one commit prior. This improved view makes that possible.
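The same trick works on the command line, since `git blame` accepts a revision: blaming at `REV^` skips past the most recent commit. A minimal self-contained sketch (throwaway repo, hypothetical file name):

```shell
# Reproduce GitHub's "view blame prior to this change" locally.
# git blame REV^ -- PATH shows blame as of the parent of REV,
# skipping e.g. an automated refactor commit.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com   # throwaway identity for the demo repo
git config user.name demo
printf 'original line\n' > app.txt
git add app.txt && git commit -qm 'initial'
printf 'refactored line\n' > app.txt
git commit -aqm 'automated refactor'
# Blame at HEAD attributes the line to the refactor...
git blame HEAD -- app.txt
# ...but blame at HEAD^ shows who wrote it before the refactor.
git blame HEAD^ -- app.txt
```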
Jennifer Bendery, writing for the Washington Post:
WASHINGTON ― Early Saturday morning, Ekram Seid hopped on a city bus with her sister, Yasmin, and made the trek across town for the Women’s March on Washington. […]
For all the diversity of the crowds, Ekram and Yasmin stood out. Both are tiny, standing well under five feet tall. Both wore hijabs. And both are teenagers. Ekram, 18, is even shorter than her younger sister and has braces. Yasmin, 13, stood quietly by her big sister. But when they spoke, they were far beyond their years. They were clear about what is at stake for Muslim women and other minorities if they don’t engage in politics and stand up for their rights.
If you don’t understand what goes on in countries that are ruled by Muhammadism, you owe it to yourself to find out. Here’s a bit from Time Magazine:
Where Muslims have afforded women the greatest degree of equality–in Turkey–they have done so by overthrowing Islamic precepts in favor of secular rule. […]
Under Shari’a, or Muslim law, compensation for the murder of a woman is half the going rate for men. In many Muslim countries, these directives are incorporated into contemporary law. For a woman to prove rape in Pakistan, for example, four adult males of “impeccable” character must witness the penetration, in accordance with Shari’a. […]
Fear of poverty keeps many Muslim women locked in bad marriages, as does the prospect of losing their children. […]
Wife beating is so prevalent in the Muslim world that social workers who assist battered women in Egypt, for example, spend much of their time trying to convince victims that their husbands’ violent acts are unacceptable.
I know someone who was deployed as a soldier to Iraq. He told me that the way women were treated was horrible, but since there was nothing you could do about it, you had to see the humorous side and laugh about it, or you’d go crazy.
Muhammadans who complain about Trump being a misogynist have no leg to stand on. Take that towel off your head and renounce your sexist religion and culture before you throw the first stone at Donald Trump.
I’ll go a step further. If you complain about Trump’s misogyny but you give credence to Islam as a viable religion in the United States, then you’re a hypocrite and should be ashamed of yourself.
If you want to understand something, don’t look at it when it’s 1% of a populace. Look at it when it’s the overwhelming majority of the populace. If what you see isn’t pretty, then it’s time you go back to the drawing board with the 1%. I shouldn’t have to tell you that Islam in its true form as we see it in the countries described above is utterly incompatible with Western thought and culture. I also shouldn’t have to prove to you that one is objectively superior to the other. If your insistence that women shouldn’t be allowed to be beaten causes you to be labeled Islamophobic, then so be it. Wear that label with honor.
After that, if you still have any strength left and still want to criticize, then you can go after Trump. Just keep in mind that unlike Muhammadans, Trump doesn’t shoot women who ask for an education. He’s not even in the same ballpark, so maybe go a little bit more lightly on him?
If you’re in pain, then President Donald Trump’s inauguration speech today was music to your ears. He told you that the status quo is flawed and promised to fix it. If you’re in pain, you needed to hear that. Pain makes you desperate for a fix. If someone acknowledges your pain, then they’re your hero, no matter who they are.1
But if you’re not in pain, or you’re in pain but it’s not because of the establishment in Washington, then President Donald Trump’s inauguration speech was terrible and bizarre and marked the beginning of a very long 4 years that you hope ends prematurely.
I think both sides are right, at least up to a point. Some people are in pain and others are not. What works for half of our nation doesn’t work for the other half. Not just in ideology and blue versus red, but in real life. The policies that work in big cities don’t work elsewhere, and vice versa. I keep saying that the single biggest determinant in who you voted for was whether you lived in a big city or not. It’s true.
With our current political setup, we’ve got one man who is supposed to keep 318 million people in 3.8 million square miles happy. The job has never been harder. In the early 1900s there were about 76 million people in the US. In the early 1800s there were around 5 million. Our population and its needs have gotten exponentially larger and more diverse but we still have one man who’s supposed to keep everyone happy.
I wish that one man the best, because he’s going to need it.
Conversely, if someone doesn’t acknowledge your pain, then as far as you’re concerned they’re part of the problem, no matter what great things they have done in other spheres. ↩︎
After a long introspective period and several attempts at survival for the service, I have to sadly announce that Avocado will close its doors.
It has been a considerable joy and privilege to help make an app that’s been useful for so many people. While I still feel the inspiration to make something to support relationships with technology, the financial and market realities in this space are tough to surmount. Over the years a wonderful team put in hard work toward sustainability but…we couldn’t overcome those challenges.
Avocado was a dating app with more whimsy than any other I’ve seen. It was incredible. My wife and I used it when we first started dating, and it has a special place in my heart. My favorite feature was the hug: you held the phone to your heart until the device vibrated, and then the recipient got a notification that you’d hugged them.
So sad to see this app go, even though the writing has been on the wall for some time.
Spoiler alert: there are none.
I was once interviewed by a prospective employer in a coffee shop. It wasn’t horrible. It was fun, actually. I was probably 19 and I didn’t care. I also turned down the job.
But man, I love a good rant. Go read this.
Professor Jeremy Frimer, transcribed:
There has been some reflection on that from the left in terms of the Rust Belt and the white working class after the election saying, “Man these people are hurting, and their grievances are real and legitimate and Trump spoke to them and I didn’t see it, but now I do see it, and I can see how ignoring them - how that might make them feel. And I can understand why they might have voted for Trump. Maybe it wasn’t just racism. Maybe it wasn’t just sexism. Maybe they really are struggling, and he offered to help.” And so it softens your opinion when you actually stop and listen to the other side.
Yes, making sexist comments is deplorable. However, someone who makes a sexist comment is still a person. And they’re coming from somewhere. And I think ultimately just labeling them as a write-off is not helpful. I think the better place to go is to have a conversation with them and understand where they’re coming from. And it’s hard to do because they’re saying terrible things. But if you can understand where their concerns are then maybe you can have a conversation and allay some of their concerns so they don’t feel the need to go to that place. Because ultimately this is a human being that beneath the surface is just like you and me; they have concerns […]
This is a great episode and I recommend listening to it in full.
I’m not happy with someone who tweets F bombs being our next president. But most — not all, but I truly believe most — of this man’s voters put him in office despite these things, not because of them. Millions of people voted for Trump because they firmly believed he would be in the best interests of the United States that they knew.1 You don’t have to agree with them. You can very strongly disagree with them, even. But there’s a way to disagree while still respecting and engaging. People like President Obama and Tim Cook are exemplary in how to do this. I wish I could engage with people I disagree with as well as they can.
The United States that they knew is the key here. Geography more than anything else determined who people voted for in 2016. Look at a map. There are two countries within the United States: those who live in big cities, and those who do not. ↩︎
Every nerd I know can name a component of their workstation that they feel is indispensable. […]
For me, that one special thing is my Apple Thunderbolt Display.
I use my Thunderbolt Display every day. I love it. There’s nothing that touches it. I’m not going to go with some LG trash. If Apple doesn’t come out with another monitor, well, this is the one I’m going to be using for the next 10 years. If Gruber can do it with the keyboard, I can do it with the monitor.
On election night, the nation was in shock as the election outcome unfolded differently than had been promised. Political advisor David Plouffe tweeted:
Never been as wrong on anything on my life. Still a beating heart in WI and the 2 CDs. But sobriety about what happened tonight is essential
Read the replies to that tweet. People were furious at how wrong the mainstream political coverage was in its prediction.
But now, we’re seeing this sort of thing from John Aravosis:
The polls predicted Hillary winning the popular vote by 3%. She won it by 2%.
The narrative has changed, seemingly. How do we reconcile these two mindsets?
Here’s how I see it. The polling data was actually pretty accurate. The outcome was within 1–2% of the prediction. But the way the media packaged the polling data was off. The media didn’t say, “Oh man, it’s going to be really close. We think Hillary is barely going to win, but there’s enough room for error that we could be wrong.” Instead, on the morning of November 8, it said, “The election is over, and now the only thing that’s left is the formality of the actual voting.” I couldn’t find it 12 hours later, but I actually saw a headline in Apple News that said that. I’m sure you saw similar.
Here’s the thing. You’re either right or you’re wrong. If you’re wrong, it doesn’t matter how close you were to being right. You’re still wrong. Now, if you’re wrong about something you made clear you were uncertain about, then people will trust you next time. If you’re wrong about something you were highly confident of, they won’t. You can eventually gain that trust back, but it takes time.
This isn’t about the polls being just 1 or 2 percentage points off. That’s forgivable. This is about the media being highly confident about something about which it was wrong. That’s unforgivable. What the mainstream media predicts about the next election isn’t going to matter to a huge portion of the populace. Why should it? The media’s interpretation of poll data has lost its credibility — not because it was wrong, but because of how confident it was that it was right.
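The gap between a close point estimate and a justified level of confidence can be made concrete with a hedged sketch (the numbers below are illustrative, not the actual 2016 polling model). If the forecast margin is modeled as a normal distribution, the honest thing to report is a win probability, not a done deal:

```python
from statistics import NormalDist

def win_probability(predicted_margin: float, stderr: float) -> float:
    """P(actual margin > 0), modeling the margin as Normal(predicted_margin, stderr)."""
    return 1 - NormalDist(predicted_margin, stderr).cdf(0)

# Illustrative: the same +3 point forecast at different error levels.
for sigma in (1.0, 2.0, 4.0):
    print(f"margin +3 with stderr {sigma}: P(win) = {win_probability(3, sigma):.3f}")
```

A 3-point predicted margin is only a near-certainty if the error bars are tiny; double the error and the honest headline becomes "probably, but no sure thing."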
We have a presidential inauguration coming up in two days. Maybe you’re excited, maybe you’re appalled, or maybe you were one of those months ago but now you’re indifferent. The guy’s got issues, but unless you choose to leave the country, he’s going to be your president. Some bad people put him there, some good people put him there. The red blindly defend him and the blue blindly condemn him.
What I’m asking is that you choose to be purple in 2017. The only way you’re going to be able to do that is by mixing two colors, two perspectives, two sides to the same coin. If all of your surroundings are one color, that’s going to be hard. In some cases, it’s going to be almost impossible. But give it everything you’ve got. Be a grownup; think critically and independently. Understand the concerns, priorities, and fears of both sides.
For quite some time, I’ve been using a pitch black wallpaper for my home screen on my black iPhone 7. The focus it brings to the colorful apps is remarkable. Recently, a colleague prompted me to take the same approach with my Mac. My Mac’s wallpaper isn’t pitch black, though. It’s a shade just shy of pure black, so that it is completely indistinguishable from the background of macOS’s menu bar. Here are the steps involved to pull this off:
- Go to `System Preferences -> General` and checkmark `Use dark menu bar and Dock`.
- Go to `System Preferences -> Desktop & Screen Saver` and select `Apple -> Solid Colors` in the left pane. In the ensuing palette, click the `Custom Color...` button in the lower right corner. Choose the `Grey Scale Slider` in the new window that appears, and set its brightness to 9%.
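If you’d rather not eyeball the slider, here’s a hedged alternative: generate the near-black color yourself and set the result as a wallpaper image. This sketch writes a tiny solid-color PNG (9% brightness is roughly RGB 23,23,23) using only the standard library; the file name is arbitrary:

```python
import struct
import zlib

def chunk(kind: bytes, data: bytes) -> bytes:
    """One PNG chunk: length, type, data, CRC over type+data."""
    return (struct.pack(">I", len(data)) + kind + data
            + struct.pack(">I", zlib.crc32(kind + data) & 0xFFFFFFFF))

def solid_png(r: int, g: int, b: int, size: int = 64) -> bytes:
    """A size x size truecolor PNG filled with a single color."""
    row = b"\x00" + bytes((r, g, b)) * size  # filter byte 0, then RGB pixels
    ihdr = struct.pack(">IIBBBBB", size, size, 8, 2, 0, 0, 0)
    return (b"\x89PNG\r\n\x1a\n"
            + chunk(b"IHDR", ihdr)
            + chunk(b"IDAT", zlib.compress(row * size))
            + chunk(b"IEND", b""))

# 9% of 255 is about 23 -- the "almost black" described above.
with open("near_black.png", "wb") as f:
    f.write(solid_png(23, 23, 23))
```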
I’ve been doing it this way for about a week now and I’m not going back. I really love it. It lets me focus on the apps that I’m using and reduces the noise. You don’t realize how distracting wallpapers are until you remove them.
Wallpaper isn’t used in homes anymore. Why would you use it with your electronics?1
I’m still using a wallpaper on my iPhone’s lock screen. I view this as a totally separate thing. It makes sense having one there. ↩︎
As part of my long-term strategy to achieve immortality, I’m building a permanent digital record of my life online.
There’s the optimistic approach, the in-between, and the pessimistic approach to how the technological future could turn out. If Scott is an optimist and most people are in between, then the pessimist predicts that due to a series of unfortunate and catastrophic events, the Internet as we know it is suddenly wiped clean, forever gone. Only its memory lasts in the minds of an aging and dying populace who once knew it, and we stumble backwards 500 years in human knowledge, technology, and innovation. It wouldn’t take a very large group of well-researched and desperate men to sabotage an awful lot of the Internet in a single day. We have yet to see a 9/11-scale attack occur in the technological sphere, and we haven’t had a World War with modern weaponry, either. And then there’s the very real threat of EMPs, a single one of which, if powerful enough, could destroy all electronics in the world instantly.
If I had to put my chances on the likelihood of Scott’s dream being fulfilled, I’d say it’s roughly equal to the chances of a technological doomsday scenario occurring.
No, monetizing is that word we need to explain how Facebook makes money. They’re monetizing friendships and privacy. Twitter is monetizing clever quips and the latest freak-out over Trump (often the same thing). Snap is monetizing looking silly to your friends with branded filters.
DHH is writing this on Medium, a free platform that’s desperately looking for a way to monetize itself. Maybe he wasn’t the one who made the decision to use Medium. I rather hope that’s the case. I find it irksome and hypocritical when people criticize the very thing they’re benefiting from.
Here’s how I see it: ideally, some software is free for beginners, and then it’s paid to get access to all of the features. Sites that do this well include Github, Strava, Slack, MailChimp, and Chess.com. Then, ideally, the rest of software costs money for even the entry level. Think Zwift, Harvest, AWeber, and Basecamp. A lot of B2C software is the former and a lot of B2B software is the latter.
DHH goes on:
It wouldn’t surprise me if twenty years from now we view the likes of Facebook with the same incredulity we do now to smoking: How could they not know it did this to their health?
The funny thing is, if Facebook wanted to play by DHH’s rules, all it’d have to do is remove its data collecting and simply charge a price for a premium membership tier. Even if Facebook were to do that though, that wouldn’t fix half the problems DHH is describing. It wouldn’t fix the fact that users would seek to create an “echo chamber timeline” that would result in a “narrower field of vision.” Facebook is deeply flawed, and changing its business model would fix some things, but I’m not convinced it would fix the underlying problems.
I’ve got to say though. If Facebook didn’t exist in 20 years and was completely replaced by better things like micro.blog, I would dance with joy.
Just 12 minutes before publishing my thoughts on switching to a Mac Mini, Mac Stories published a piece from Stephen Hackett detailing the family history of the Mac Mini, and where it’s hopefully headed. His conclusion:
From its humble beginnings as the BYODKM Mac to its role as a server, the Mac mini has been a faithful workhorse for 12 years now. It deserves another chance.
I couldn’t agree more.
I’ve always associated larger Apple products with work and smaller ones with leisure, and lately I’ve been thinking about taking this to a new level. I’m thinking about selling my MacBook Pro and switching exclusively to a Mac Mini and an external monitor.
My workstation for the past 15 months has been an entry-level Mid 2015 15” MacBook Pro: 256GB SSD, 16GB of RAM, and a 2.2GHz quad-core Intel Core i7. It’s served me well, but my dev setup is really resource intensive; I could use more power. Also, my SSD is 92% full. I’m due for a new Mac, and I’m trying to decide what to get. Ever since the Late 2016 MacBook Pro was announced, my gut’s been to go with the base model of the upper-end 15”: 512GB SSD, 16GB of RAM, and a 2.7GHz quad-core Intel Core i7.
The problem is, this machine is annoyingly consumer oriented, in my book. I want my money to go towards hardware performance, not towards niceties like the Touch Bar. That’s why I’m thinking about going with a Mac Mini. Rumor has it that an update for it will be announced in March.
A Mac Mini wouldn’t work for people who are mobile all the time, but the vast majority of the time, I’m working at my desk with my laptop connected to an external monitor, keyboard, and mouse. I’m not using the laptop for its intended purpose. I’m paying twice as much as I need to for hardware performance, when a desktop would give me a user experience I wouldn’t find to be any inferior.
For those rare times where I am traveling or generally on the go, I could pick up an inexpensive mobile monitor and connect it to my Mac Mini. It wouldn’t be usable in my lap, but it would work great on a desk of any size, and it’d be as portable as a laptop. Also, I don’t enjoy using a laptop in the situations where a laptop is the only option; things like traveling in a car, waiting in an airport, or flying on a plane. It’s next to impossible for me to get serious work done in contexts where a desk isn’t available, so why even bother? Having only a Mac Mini would take that option off the table completely; good riddance.1 Those contexts are what iPhones and books are for.
Another option is to just buy an iMac. But aside from the fact that this wouldn’t be portable in any fashion, it also forces your screen technology to be bound to your computing technology. The beauty of a Mac Mini is that it allows you to decouple your monitor upgrade cycle from your computer upgrade cycle. Right now I have a Thunderbolt monitor, but if I later want to upgrade to an LG 4K monitor, I can.
I haven’t fully made up my mind about going the Mac Mini route, and I won’t until I see what’s announced in March. But this is what I’m thinking, for now.
Update: I should have stated at the beginning that the Mac Mini isn’t going to be a serious option until it comes in quad-core. The difference between dual-core and quad-core is too drastic. We’ll see whether this is an upgrade it gets in 2017 or not.
Obviously everyone’s different, and some people really do have to get work done in situations where a desk and power source aren’t available. I’m just thinking aloud about what works for me, what I actually need and use, and I’m realizing that it’s very different from what I heretofore thought it was. ↩︎
Happy Friday the 13th. I’m trying to commit this word to memory. Once you know how to pronounce it, you’re halfway there.
That’s how Gruber describes Night Mode, which has been available since iOS 9.3. Night mode isn’t for everyone, for sure, but I find it amusing how appalling it is to him. Different strokes for different folks. I have night mode scheduled to turn off at 2:59 AM and to turn on at 3:00 AM. Until 2017’s iPhone comes with natural lighting, and it will, I’m going with the next best thing.
Hat tip to PodSearch for the handy link, by the way. Very cool service. Obviously since this is The Talk Show we’re talking about here, the link serves merely to trigger your memory of an episode you’ve already listened to.
Install this Chrome extension and then head over to Daring Fireball and look in dismay at the comments. They’re all horrendous, every last one of them.
From Dicknose, for instance, on Gruber’s recent AirPods video:
A monkey could figure this out.
Grubers [sic] hands look like he hasn’t worked a day in his life.
If comments were natively built into Daring Fireball and were consequently publicly viewable to all, and if Gruber moderated them, then it’s reasonable to assume that the comments would be kinder and better.1 However, it’s not reasonable to assume that they would be kind and good enough to warrant being on the site. Public comments are a needless distraction. They’re psychologically wearing; when I don’t read them, I walk away with a sense that I’m missing the full story, and when I do read them, I reprimand myself for wasting time.
This is exactly why I don’t use App.net or Twitter or anything else. I use static HTML files and old fashioned email. It’s technology that I’m in control of, it’s been around for decades, and it isn’t dependent on any single vendor.
Ideology aside though, it’s painful admitting that an idea that you had, which you turned into a reality through many hours of hard work, must be buried in the sea grave of unsuccessful software. That’s a sad narrative, and my genuine sympathies go to the talented team. There was a time when I was exuberant about App.net.
I find it ironic that macOS is still showing the amount of time until the battery is recharged. If user action on a computer is a determinant in how long it’s going to take to discharge a battery, wouldn’t you think that it would also be a determinant in how long it takes to recharge that battery?
Update: as a reader has pointed out, it isn’t actually clear whether user action on a device would slow down a charge. A MacBook Pro can function just fine without a battery, so long as it is plugged in. I found this helpful tidbit on Quora:
When you are plugged in, your laptop is directly powered by the A/C adapter, not the battery; only excess power goes to the battery.
The MacBook Pro I’m using has a MagSafe power adapter of 85W. I’m not sure how much power my computer actually uses, but I’d be surprised if using my MacBook Pro to its limits would leave enough leftover wattage for the battery to get as much as it would if the machine were turned off. But even if that were the case, it wouldn’t be enough to give credibility to my original point. Not only would it have to be true, but the amount of excess wattage remaining during a heavy task versus a light task would have to vary enough to change the estimated time remaining until a full charge. Is that possible? Maybe. I don’t know.
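To put rough numbers on the idea (hedged: the figures below are assumptions for illustration, not measured values), suppose a roughly 99.5Wh battery and an 85W adapter, with whatever the machine isn’t drawing going to the battery:

```python
def charge_time_hours(capacity_wh: float, adapter_w: float, draw_w: float) -> float:
    """Naive charge-time estimate: leftover adapter wattage goes to the battery."""
    leftover_w = adapter_w - draw_w
    if leftover_w <= 0:
        return float("inf")  # the battery holds or drains even while plugged in
    return capacity_wh / leftover_w

# Assumed figures: ~99.5Wh battery, 85W adapter.
print(charge_time_hours(99.5, 85, 10))  # light use: ~1.3 hours
print(charge_time_hours(99.5, 85, 60))  # heavy use: ~4 hours
```

Real charging tapers off near full and isn’t perfectly efficient, so this is only a sketch; still, it suggests a heavy workload could plausibly triple the estimated time, which is exactly the variability in question.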
When I was in college, the joke went around asking how many computer scientists it took to change a lightbulb. The smart answer was zero, because a lightbulb is a hardware problem. I’m a computer scientist, not a hardware technician. I have no idea about this stuff, truth be told.
Regardless, it’s an interesting inconsistency at this point that the duration until a full battery is viewable in macOS but not the duration until an empty battery. I’m glad for it though; I don’t agree that the latter was bad enough to warrant its removal, and even if it were, I’d want the former to remain. Consistency is good, but it comes second to user experience. Knowing how long you have to wait until your battery is recharged is a good user experience, and there are situations where it comes in handy. Here’s to it not being removed.
John Gruber on Fedora Review, 15 years ago:1
I would sooner use a pair of dirty socks to touch my food than use these tongs. And indeed, despite all the “Use the Tongs” propaganda at the Stop & Shop Au Bon Pain bagel kiosk, there is a tissue paper dispenser under the tongs. I, of course, use the tissue paper. But every time I pick up a bagel, I wonder if it has been touched by those tongs. This dictates how I choose my bagels – I pick from the back, looking for the bagels which appear least likely to have been contaminated.
Reading this piece, I realized that it’s not just his writing about technology that makes Daring Fireball so special. Gruber can write. He has something most people don’t. He is to prose what Vladimir Horowitz is to the piano: loath to adhere to the expected status quo, tasteful, deeply engaging.
It’s a pity Fedora Review didn’t continue. I would read this site. Tastefully written reviews of things completely irrelevant to its audience. What could be better?
A couple of days ago I installed GPG so I could sign my commits. Here are a few tips to get everything working correctly.
- Don’t have both the GPGTools Suite and a Homebrew-installed GPG on your machine. Go with one or the other. I went with Homebrew. All manner of havoc will break loose if you do not follow this advice.
- Add this to the end of
- Make sure this line is at
When I’m willing to spend more time exploring, I’ll seek to get GPG 2.x working. All the Github documentation assumes you are using GPG 1.x and I found it easier going with this, although my problems were probably caused by the fact that I was not adhering to bullet #1 above.
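For reference, the core of the git side of this setup boils down to three `git config` lines. A hedged sketch; `YOUR_KEY_ID` is a placeholder, and you can list your own key IDs with `gpg --list-secret-keys --keyid-format long`:

```shell
# Point git at the Homebrew gpg binary (avoids the GPGTools/Homebrew clash above).
git config --global gpg.program "$(command -v gpg)"
# Tell git which key to sign with. YOUR_KEY_ID is a placeholder.
git config --global user.signingkey YOUR_KEY_ID
# Sign every commit by default, instead of passing -S each time.
git config --global commit.gpgsign true
```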
Meanwhile, I’ve added my GPG public key to the contact page just for fun. Why not? Send me an encrypted email, and if you attach your GPG public key, I’ll reply in kind.
Here’s a small change to your macOS settings that you’ll thank me for. Go to Mission Control and uncheck the setting labeled “Automatically rearrange Spaces based on most recent use.”
Most of the time, when I’m changing a default macOS setting, I recognize that the default is good for most people and that I need an exception for reasons specific to me. However, this is one of those times where I really don’t approve of Apple’s choice of default. Having spaces automatically rearrange makes it impossible to have an order to your spaces that you stick with and memorize. I didn’t know this setting existed until recently, and I wish I’d disabled it years ago.
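If you prefer the command line, the same setting is exposed (on macOS) through `defaults`; this is the commonly cited key for it, so treat it as a sketch:

```
# Disable "Automatically rearrange Spaces based on most recent use",
# then restart the Dock so Mission Control picks up the change.
defaults write com.apple.dock mru-spaces -bool false
killall Dock
```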