Toby's Log page 86

Woo, the Cleveland curse broken after 52 years. Such a close game all the way to the end. Stressful, back and forth. It worked out that the Cavs were ahead at the end. A battle. Something for Cleveland 52 years in the making. Don’t know if this will change things for the future of the city and area, but it at least changes something of what it feels like to be a Clevelander / Northeast Ohioan.


Cavs

I think I’ve watched or listened to more basketball this playoff season than in any previous entire season. Last year I watched the finals, but this year I’ve been more into it. It has been surprisingly time consuming, especially when I get caught up in the pre-/post-game news and speculation. It has also been rather stressful, even when the Cavs win, especially these finals. It is easy to get caught up in the game and want the local team to win. It can feel like their identity is part of my identity and their success is part of my success, in part because of the city’s renowned “Cleveland curse” and broader “rustbelt” identity, set against a desire for the area to succeed. I sometimes have to remind myself that their game is not my game and their success is not my success. It can feel great when they succeed, and it might influence the city’s morale, but it is unrelated to success in my own life. Just watching the games has been entertaining, some of the plays and broken records have been exciting, and so has seeing how far the Cavs have come against expectations. Even if they lose game 7, I think it will still have been worth watching. That said, I do hope the Cavs defy the odds and finally break the “Cleveland curse”. Go Cavs.


Stir Trek 2016 talk videos

I’m glad that Stir Trek has released videos of its 2016 talks, since I didn’t luck out in getting a ticket this year like I had last year. At least I will (hopefully) be much less tired watching them than when I drove down to Columbus early morning last year. Of course, the videos are good to have even for people who went: These multitrack conferences always seem to have multiple good talks at the same time, including at least one slot with multiple “must see” ones.


Flight of the Conchords show

Saw Flight of the Conchords do a live show at Playhouse Square to(last)night. The show was opened by Arj Barker (Dave from the TV show), who did a stand-up routine. It was quite funny. FOTC did about half new material, half old, but the old stuff had new parts interspersed. Funny stuff. And not just the songs: The banter in between was plenty funny itself. It was hard to hear at times when the crowd was loud, but I caught most of it. Glad I decided to go.

Of note, playing at the same time and one theatre over was Opera Circle’s production of Il trovatore, which my friend / coworker was involved in. I had been torn between the two. Less than 24 hours before show time, I checked for FOTC tickets and they were listed as sold out, so when I left, I was expecting to see the opera but intended to ask about FOTC tickets anyway. When I arrived, tickets were in fact available. The cheapest was $55, so I was ready to pass on that and had started asking about the opera when the lady said somebody had returned a ticket that they couldn’t charge for, and that I could have it. I couldn’t turn down a free ticket and a rare chance to see FOTC live. Also, I had arrived late for the opera because the rapid took longer than expected. I’ve seen many of Opera Circle’s operas since meeting my coworker. So, sorry Wanda, I’ll see the next one.

I took the rapid downtown because I don’t like dealing with traffic and parking there, and for one person the price is hard to beat. I was a little anxious walking around downtown because of an event that happened last year, but nothing happened this year. It did, however, involve some waiting.


Algorithmically derived passwords

I’ve been considering a new password storage method for a while now. Currently, I have a system where I compose passwords from pieces of several different values that I have memorized. Each value has a key that I have mentally associated with it, and I have a file listing the keys for each site. Lately, I’ve also been deriving part of each password from the name of the site. This has helped somewhat with making the passwords memorable, but I still frequently have to look at my password file. If someone got hold of this file, it would take some dedication and knowledge of me, or at least access to the plain text of some of the passwords, to crack the system. Nevertheless, I’ve been looking for something easier to use and preferably more secure at the same time.

I’ve been looking at options like YubiKeys and 1Password, but they have their issues. Today, I came across a cool option wherein passwords are algorithmically derived from a single master password and the site name. This is sort of like what I’m doing in my head for some of my newer passwords, but much more advanced: it can produce hashed passwords of a desired length and even honor character constraints. I read about the idea in a post by Tab Atkins, who has his own solution freely available to use. The comments on his post also led me to SuperGenPass, a similar idea.

Both of these options are built purely with web technologies, making them easy to use anywhere. Both are open source, so I can check their code, verify they are doing what I want, and modify them to be slightly different. Neither needs to store anything (unless you change the config for SuperGenPass) or requires an account. Both can run as a single-page file, even offline: you type your master password and the name of the site, and they give you the password in plain text to copy elsewhere. SuperGenPass also has a bookmarklet option that can be run from the page you are entering the password on (obviously web only); it uses an iframe (which requires a third-party server) and can put the password directly into the password field, bypassing the need to copy the password at all.

So these are definitely interesting options for making work with passwords much easier, and the passwords I have for each site can be more complicated and theoretically more secure than what I use currently. The biggest danger would be that, if someone figured out my master password and the settings I’ve used for the generators, they would have access to any passwords I’ve created with them. I would probably use multiple master passwords, at least a normal one and an extra-secure one, just to limit the number of accounts they could access. I could even throw in a simple modification to the master password derived from the site name to make it even less of a problem.
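
To make the idea concrete, here is a rough sketch of how such a generator might work. This is not the actual algorithm of Tab Atkins’ tool or SuperGenPass; the hash, iteration count, output length, and character set below are placeholder assumptions for illustration only.

    // Sketch only: derive a per-site password from a master password plus the
    // site name, so nothing needs to be stored anywhere.
    import { pbkdf2Sync } from "node:crypto";

    function derivePassword(master: string, site: string, length = 16): string {
      // Stretch master + site into raw bytes; the site name acts as the salt.
      // 100,000 iterations and SHA-256 are arbitrary choices for this sketch.
      const bytes = pbkdf2Sync(master, site, 100_000, length, "sha256");

      // Map each byte onto an allowed character set so the result can satisfy
      // a site's character rules. (The modulo introduces a slight bias; real
      // tools are more careful about this.)
      const charset =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789!@#$%";
      let password = "";
      for (const b of bytes) {
        password += charset[b % charset.length];
      }
      return password;
    }

    // The same inputs always produce the same password, so there is nothing
    // to store or sync:
    console.log(derivePassword("my master password", "example.com"));

A second, extra-secure master password, or a per-site tweak to the master, would just change the first argument; the rest of the scheme stays the same.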


Wednesday evening, as an addition to my regularly scheduled shift, I got to play locksmith, first attempting to fix the lock on the office door, which wouldn’t lock for me, and then replacing it with the old one.


Bird rescue attempt

Today (yesterday), I spent around three hours getting a bird detached from a tree in the back yard. It was just hanging there by its leg, and it was hard to tell how it was attached. It was pretty high up and not reachable from my ladder with any pole-like implements we had available. My roommate helped for a while but had to leave for work. I eventually succeeded in getting the bird down using two ropes: one put in place by throwing it with a Kong dog toy on the end, the other raised with a garden tool tied to a stick. The ladder wasn’t entirely stable with the long combined pole, and the rope-throwing took many tries. The branch was still alive, so the operation took a while even after I got both ropes in place. I set up some yard-waste bags and a towel below the bird to soften its landing, then used the rope to break the branch, causing the bird and branch to fall.


Dreamhost must’ve had an outage of some sort this (last) morning. I noticed a little after 11 that I couldn’t upload anything to or log into my (shared) server. My sites were inaccessible. I tried the sites of a couple other people I know using Dreamhost (also shared), and they were also inaccessible, so it must’ve been something somewhat significant. Strangely, nothing relevant was on Dreamhost status. I tweeted about it at 11:20 and got a response from DreamhostCare that they were looking into it. They didn’t say anything more, but I noticed things were up and running again around 11:42. I found later that it must’ve been a DDoS on their nameservers. Outages have been rare, but certainly annoying when they happen.


Duplicate selectors: Increase specificity without being more specific

CSS has a concept of specificity, wherein more “specific” selectors take precedence over less specific ones. Sometimes the specificity rules cause one set of property values to be applied when another is desired. This can lead the developer to increase the specificity of the desired set to outweigh the other. When I’ve needed extra specificity, I’ve often used an ‘html’ class on the <html> element or a ‘body’ class on the <body> element. The downsides of this are that it:

  • is more specific, as in precise, meaning the selector won’t match in a document without those helper classes.
  • has a performance penalty for needing to check a(nother) parent element of the target element.
  • only allows one more unit of specificity at the class level for each parent used.

Today (yesterday), I found a better way that can add any amount of class-level specificity (weight) without being more specific (precise), thanks to CSS Wizardry. I’ve been doing this CSS thing for a while, but I hadn’t realized .foo.foo would match <div class="foo">. In essence, you can chain a selector onto itself to create an equivalent selector with double the specificity. You can duplicate it as many times as needed to get the desired specificity, e.g. .foo.foo.foo.foo to override .foo.foo.foo, without requiring any parent selectors. Besides the benefits already mentioned, it could be seen as more explicit in its purpose than using parent elements, because there is no other reason to do it. I will have to start using this.
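
To make that concrete, here is a minimal example of my own (not taken from the CSS Wizardry post):

    /* The rule we want to override. */
    .foo { color: red; }

    /* Chaining the class onto itself matches exactly the same elements
       (anything with class "foo"), but with twice the class-level
       specificity, so it wins without needing an html/body parent hook. */
    .foo.foo { color: blue; }

Any element with class "foo" ends up blue, regardless of where the two rules appear in the stylesheet.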


</toby>