An open letter to advertisers who target millennials

That is, my age group. Everyone’s trying to sell stuff to 20-year-olds. Seems like they think we’re easy to sell to.

…We are not. We’re broke. But if you still want to, here’s what you’re doing wrong.

It’s been said over and over again that millennials hate ads. I’m going to argue with that a little bit. Everyone’s been freaking out that “traditional advertising” no longer works, but they haven’t actually bothered to figure out why, and have instead jumped into wild guesswork about what these young whippersnappers will respond to (read: fork over money for).

Here’s what we actually don’t like: being manipulated. Apparently many advertisers have forgotten that ads don’t have to be manipulative, so they’ve come to the conclusion that they don’t work at all, instead of considering that their favorite tactics are actually counterproductive.

(And yes, this seems a little tangential for this blog, and a little ranty, but it’s important because programmers often rely on advertising as a sole or main source of revenue for websites and mobile apps–and we sometimes have to advertise our own products as well. So, this post is for anyone purchasing advertising, in the hopes that they’ll end up making a better revenue source for programmers. We both profit if you guys get this right.)

Overexposure

Have you ever heard of a game called Choices? Or Episode? (I’m not totally clear on the difference, if there is one. I’ve never played either.) If you haven’t, you’re almost certainly not a 13-to-30-year-old girl, or at least you haven’t been revealed as such to advertisers. Even people outside this demographic probably remember at least one ad, because they’re among the most obnoxious I’ve ever seen. Usually they’re about people cheating on each other, but sometimes they feature people breaking up with each other, interactions at a party (usually involving making out with a stranger), or a woman getting pregnant. They’re always demeaning and ridiculous.

[Image: one of the Episode/Choices ads in question]

 

Yeah, that kind of ridiculous.

I’ve seen these mocked endlessly by people who use the platforms they advertise on. You’d think this attention would be good for circulation, but instead it’s become the app you don’t want people to find on your phone, ever. I’m sure some people played it to find out just how ridiculous it was, but with a free-to-play, you’re only going to get money from people if they’re actually invested in your game. Mockery is not investment.

Or take the ads for The Walking Dead, which are so prevalent on Tumblr that they were turned into a meme, which then spread to Walking Dead ads which didn’t feature this scenario, and then to just about any ad on Tumblr users considered annoying enough.

First they came out with this one:

[Image: The Walking Dead ad]
One meal, two lives. It’s the zombie apocalypse. What’s your choice? Feed yourself or feed this starving child?

At first the ads were ignored. But they became so prevalent and so obnoxious that you’d run into one after every five posts (on mobile, at least–that was my experience). Eventually users got (heh) fed up with them, and started responding.

But not in the way the advertisers wanted.

With comments like, “that is the ugliest kid I’ve ever seen” and “why can’t they split the meal again???” the ad was… less than successful. But, figuring out that gamers actually care a lot more about virtual dogs than kids, they tweaked the ad to manipulate a better response, and replaced the kid with a dog.

That’s when this happened.

[Image: the same ad, with a dog]
Feed yourself or feed the dog? Well… at least the dog won’t starve one way or another.

If you can’t read this, it’s the same picture but with a dog as the second option. The first comment is a more genuine response, “feed the dog” (I’ll spare you the all-caps on these), but the rest suggest either feeding the child from the previous ad to the dog, or cannibalizing the child yourself and letting the dog have your TV dinner.

And thus, Eat The Child became a meme. No Tumblr ad was safe, at least if they left the comments or reblogs on. If users don’t like your ad, they’ll spam it with Eat The Child in any way they can, or ignore you, or even block you (sometimes this is possible on Tumblr). After all, from their point of view, what you’re sending them is spam.

So how do you become not-spam? Why are advertisements so reviled? I think the main problem comes down to this.

This is what your ad looks like when we’ve seen it ONCE.

Have you heard about our product? (showcases product)

See, this is good! Business philosophy should be based around, “We’re making something you want. Will you trade us money for it? It might be a good deal for you.”

This is what an obnoxious ad looks like around the thirtieth time we’ve seen it that week:

(advertiser's face pressed up against screen) HAVE WE WORN DOWN YOUR WILLPOWER YET???

This business philosophy says, “We’re going to ask again and again until you give in and hand us money. We’re trying to psychologically manipulate you.”

Yes, manipulate. We all know about the theory of ego depletion, we know you’re trying to use it against us, and it pisses us off.

Now, how much control do you have over this? I’m not entirely sure, as I’ve never bought ad space, but I think it’s a fair amount considering that I see some ads WAY more frequently than others. Apps and so on may be offering bulk deals where your ad gets way more screen time. Don’t take them. They’re counterproductive.

Also, the more narrowly you target your ads, the less likely it is that a lot of other advertisers are competing for that exact audience. So those users won’t see as wide a variety of ads, and every slot will be filled by yours. This is not a good thing. You have to strike a balance.

Users seeing your ad once a day is a good thing. Users seeing your ad once every twenty minutes is a bad thing. It doesn’t scale the way you think it does. Ego depletion only works if the prospect is actually tempting. (Even then, the theory seems to be under dispute. Psychology is a very young field and it’s hard to draw firm conclusions quickly.)
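If you’re on the other side of this and writing the ad-serving code yourself, the fix has a name: a frequency cap. Here’s a toy sketch of the idea (the class, names, and numbers are all made up for illustration, not any real ad platform’s API):

```python
import time

class FrequencyCap:
    """Toy frequency cap: show a given user a given ad at most once per interval."""

    def __init__(self, min_interval_seconds=86400):  # default: once per day
        self.min_interval = min_interval_seconds
        self.last_shown = {}  # (user_id, ad_id) -> timestamp of last impression

    def should_show(self, user_id, ad_id, now=None):
        now = time.time() if now is None else now
        last = self.last_shown.get((user_id, ad_id))
        if last is not None and now - last < self.min_interval:
            return False  # they saw it recently; don't wear them down
        self.last_shown[(user_id, ad_id)] = now
        return True
```

Real ad networks generally let you configure something like this when you buy the placement; the point is just that “once a day” is a setting, not an accident.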

Once your ad becomes overexposed, you’re also about to hit a wall with word-of-mouth. Say a user does give in to the ad they’ve seen twenty times, and they actually do like your product. Are they going to tell their friends about it, and give it a good recommendation? No! They’ll assume their friends have already seen the ads, far too many times. Not only will bringing it up be considered annoying, but you’ve done all the selling already–and less effectively than your user would have.

It’s cool to find something that not everybody has seen, and introduce it to your friends. It’s not at all cool to introduce something your friends are already sick of seeing.

Now that I’ve hashed out why overexposure is bad, let’s talk about audio.

Audio

There’s some Harry Potter game that came out in the last two weeks. I already hate the ad. It barges in with “YOUR LETTER HAS ARRIVED!”

First of all. No it hasn’t. I’m twenty years old. Yes, I grew up with Harry Potter. Yes, I like the story a lot. No, I don’t want to play your Sparklypoo game. (Seriously, the protagonist of the ad and the comic look identical.) You should be marketing to 13-year-olds, or focusing on the story in the game rather than the appeal of YOU GET TO GO TO HOGWARTS IN A DUMB PHONE GAME, WOWWW.

I recommend instead that you put instrumental music in the background.

Why? That doesn’t say anything about what my product is.

Exactly. Users who have put down their phone to listen to something on YouTube (this is where a lot of these ads live, that and Instagram) have to pick their phone up to figure out what the music is. It’s an instinctual compulsion, but not one that will make people mad if you use it. It’s called curiosity. Instrumental music is kind of like clickbait, but way less annoying. And if they’re already holding their phone anyway, the music is less likely than other audio to be irritating, or worse, embarrassing if there are other people around.

But only if the music itself isn’t annoying. How do you know? Make a playlist on your phone and make the music you’re thinking of using come on every two or three songs. Just the clip you want to use, not the whole song. Keep your headphones in all day. If you don’t hate the music by the end of the day, if it’s still inoffensive, then it’s probably fine.

Condescension

“But I like classical music, and my target audience is teenagers, so I should use pop music.”

Please don’t. This line of thought is condescending. Classical music is inoffensive to most people, including teenagers, and pop music (or what you might think of as pop music) is irritating to many people, also including teenagers.

This ruins a lot of ads.

Also, I can’t remember where I heard/read it, but it’s accurate if you pay attention: ads targeted towards men center around the message, “you’re awesome, and so is this product!” while ads targeting women focus on the premise, “you’re not good enough, and this product will fix you!”

This isn’t a millennial thing specifically. It’s just really annoying and manipulative in general, and people are picking up on it. Millennials don’t like manipulation. Cut it out.

Memes

The above point, condescension, is closely related to advertisers’ clumsy use of memes. Done right, memes can be effective advertising. Done wrong, they make you look like you’re trying really hard to “Relate.”

The thing is, memes are jokes, and even trickier, they’re in-jokes. If you’re actually in on the joke, then they’re awesome. Take the Denny’s Tumblr account. They’re not paying for ad space. It’s a normal Tumblr account. Which people willingly follow. Why? Because it’s actually hilarious, for a bizarre surrealist form of hilarious. Like, what the heck is this?

The exfoliating properties of the meat loofah and marinara soap are ideal for body and facial cleansing.
I’m… not entirely sure what they’re trying to say here. It’s pretty gross.

But I follow them for that grossness. It’s weird and kind of off-putting, but that’s what makes it so great. And they can get by with this while selling food.

If you’re not one of the people who inhabits memeing spaces enough to find them funny, you’re probably going to get memes wrong by imitating what you think is the joke, and getting it wrong or overdoing it.

There’s a very delicate balance you have to strike to get it right, and a lot of components.

The graphics and editing you use should often be kind of crappy, but not so bad that they’re unreadable or you look like you’re trying too hard–and even then there are lots of exceptions. If you use the wrong font, it’ll be obvious: some memes use Impact, others use Comic Sans, others use something entirely different, and sometimes you have to actually alter the words themselves in Photoshop to make them look worse. Many memes are self-deprecating or look intentionally bad, but you can’t go too far in that direction either.

If you try to imitate an existing meme, you run up against its lifespan. Often by the time an advertising committee hears about a meme, it’s kind of old, and they tend to have a half-life of about six weeks. Let’s say you have your ear to the ground and you hear about it in two. Next month, it’s going to get kind of stale. Are you willing to change your advertisements that often?

And if you get it wrong, you end up with this steaming dog turd:

[Image: the Totino’s ad in question]

We didn’t take kindly to this. Let’s dissect it.

  1. Colored Comic Sans. Would be fine under some circumstances, but not these; the third line isn’t easily legible because of contrast issues.
  2. Awful stereotypical girl names.
  3. Nobody talks like that. Not even groups of interchangeable high school girls in push-up bras.
  4. This isn’t remotely close to how you use “wow I cannot even”. Slang has rules which are just as strict as normal grammar.
  5. Why are they drinking out of mugs? What is that, black coffee? Or are they drinking Sprite or milk out of a mug like someone’s dad?
  6. You can’t see it in this image, but they usually tag their own posts with things like #lol and #dank memes and #fun. Nobody browses those tags, except maybe other 40-year-old advertisers who are “trying to figure out what the kids are into.”
  7. People only use the phrase “dank memes” ironically, and ironic usage is a whole other pot of “you’re gonna screw this up.”

People REALLY hate Totino’s ads. The responses can get pretty hilarious.

Names and avatars blocked out to protect the guilty. 🙂 Also a few words, to keep things PG-13.

This response isn’t universal across all of Totino’s ads, though. Sometimes–more often recently–they get it right and people remark that they’re “getting closer to Denny’s.” This is because anyone, of any age, who hangs around memeing communities long enough will pick up the aesthetic and the sense of humor. But it takes a while.

[Image: a wannabe meme ad]

The easiest way to go about it is to hire some college grad who needs to pay off student loans, and have them turn out memes for you. You don’t need to have a degree to be good at this job, but it helps your chances of snagging one if they’re a little desperate to find work because they have bills to pay. Don’t go straight for the marketing majors, they’ll demand more salary than… let’s say, a half-stunned English major who walks into your interview with a brown belt and black shoes and dark circles under his eyes. If he responds to an awkward silence by giving you finger guns and a worried smile, that’s the kind of person you’re looking for.

That brings me to my last point.

Staff

If you’re trying to appeal to the 18-24 year old audience, hire some. At least several of them. There’s probably a college or two nearby, just go grab as much of an assortment as you can by bribing them with money and donuts. There’s always some department in every college that’s trying to get the students hired, so just find yourself a few interns and ask them to improve your targeted ads. And then let them.

You folks could be doing so much better, in terms of quality and in terms of the response you’re receiving. No, your old tactics aren’t working any more. But you’re adaptable. I know you can do better.

Hugs,

Your targeted customer

 


Cheap book alert!

Humble Bundle has an O’Reilly ebook bundle on functional programming, and you can get 15 books for $15! It’s not just Clojure–there’s JS, Scala, Haskell, Rust, Erlang, and what looks like some general-purpose functional books.

https://www.humblebundle.com/books/functional-programming-books

Not an affiliate link, this is just an insta-buy for me 😮

It’s always worth supporting Humble Bundle. Every purchase donates to charity. Usually you can choose either which charity, or how much is donated, or I think sometimes both. Depends on the deal but they’re good charities and the default is at least 10% donated. You can sometimes knock it down to 5% if you want to keep the other 5 for yourself as store credit.

Remember they’re ebooks, not print. So you can read them on your phone or computer or Kindle or whatever. Much lighter than normal textbooks. But if you’re hoping to make a bookshelf look impressive it’s not the deal for you.

Anyway, go learn functional programming! It’s survived for like 60 years; it’s not going away.
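If you’ve never touched the style, here’s the flavor in miniature–pure functions and composed transformations instead of loops that mutate state. (Plain Python, not from any of the books in the bundle.)

```python
from functools import reduce

# Pure functions: output depends only on the input, nothing is mutated.
def square(x):
    return x * x

def is_odd(x):
    return x % 2 == 1

# Compose transformations instead of writing a loop that mutates state.
numbers = [1, 2, 3, 4, 5]
odd_squares = list(map(square, filter(is_odd, numbers)))   # [1, 9, 25]
total = reduce(lambda acc, x: acc + x, odd_squares, 0)     # 35
```

The languages in the bundle take this much further (immutable data everywhere, pattern matching, lazy evaluation), but the core habit–small pure functions, glued together–is the same.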

“Earning Your Labels”

I was just thinking about the problem of elitism and gatekeeping in the software community. You know, the “You’re not a real programmer if you [use Notepad++ || only code HTML and CSS || only know Python || still use Windows* || don’t know about pointers || etc etc etc]” arguments. Most of these boil down to a pissing contest about who can use the most bare-metal programming language, do bit hacking to get a .0001% increase in optimization, and use butterflies and wind currents to write their programs.

So, where do you draw the line? How high-level do you have to get before you’re “not a real programmer?”

*Yes, I’m a huge Linux advocate. But come on. You can program on Windows. I personally don’t like doing so, but you can. If you have to bash on someone for using the OS they know, it’s because you don’t feel secure and confident in your own skills. All you’re doing is making someone feel like trash and not want to build things. It’s immature. Cut it out.

“Real Programmers”

My initial reaction is that, if you’re writing code to tell computers what you want them to do, you’re a programmer. It doesn’t really matter how many layers of interpreters you have to go through in order to get the command across. My line is drawn past HTML and CSS (that’s legit), but before Wix (…picking out templates and arranging a couple things isn’t programming).

What about Dreamweaver? Are people who use Dreamweaver programmers? (If you’re cringing, it’s likely that you too have not used DW in a few years. I hear it’s gotten loads better. I still don’t really want to touch it though. Bad taste in my mouth.) They don’t necessarily write the actual code, but they are using creativity and intelligence to design and build software.

If you think the answer is, “No, that’s not real programming,” I ask: what about programs that are really just a couple library calls stitched together? Sure, you have to know what they do, and you have to know some syntax to get them to cooperate, but is that programming? What about Excel macros?

This is a loaded question, because “programmer” is an identity tag. There are lots of levels of expertise, and people put loads of effort into learning. They also often put effort towards feeling “worthy” of the label. To let someone else just waltz into your exclusive club of smarty-pantses feels like some kind of glove-slap. But the only reason you feel weird about letting someone new use the label is that you wanted to be in the club at one point, and whoever was already in it made you feel weird about not having “earned it.”

This is pointless. It’s so pointless.

My line, you might notice, isn’t too solid. Programming is a skill; to call yourself a programmer you need to have invested at least some time in learning how to tell computers to do things. But telling computers to do things is something everyone does, every day, even if it’s just posting cat pictures to the Internet, and one intuitively understands that that isn’t programming. So the definition of programming needs another component: you need to be using creativity to build something for someone else to use. I also feel that you should have a modicum of closeness to the computer that the average user doesn’t see. You should be typing code of some sort, probably. Yet at the same time, I still consider Scratch users programmers.

This is all kind of weird and fuzzy and I don’t have a solid answer. Let’s aim for something a little more abstract.

Who is a hacker?

This feels like easier ground to cover. A hacker is someone who makes stuff, usually involving code, always involving skill and creativity. So, can Dreamweaver users be considered hackers?

I’ll admit, I have the same knee-jerk response of “no,” but that’s because of my opinion on Dreamweaver. Let’s try again.

Imagine there’s a program that uses a WYSIWYG editor to build software, and it actually somehow outputs sensibly-structured code, which you may or may not have to tweak. You need to tell the computer, through this editor, how you want things to look and what they do. You design the visuals and the functionality yourself. Is someone who uses that a hacker?

I think yes. I wouldn’t consider that person a programmer, unless they tweak the code a lot. But I’m willing to extend the label “hacker,” because they’re building something.

In other words, I consider designers hackers, particularly if they design functionality as well as visuals. A solid designer goes to a programmer with a startup idea and both contribute substantially to the final product–they’re both hackers.

So, I know there’s someone who disagrees with me. ESR’s “How to Become a Hacker” FAQ, linked in the sidebar from the glider symbol, outlines the minimum amount of skill he considers necessary to be thought of as a hacker. In fact, my most well known post is linked from that guide, and probably a lot of my readers got here from it. But I think the guide, while helpful and very much in the hacker spirit, is showing its age just a bit.

C, Java, Perl, Python, and Lisp as the main languages one should learn definitely seems outdated. These days you’re probably better off learning JavaScript, Kotlin instead of Java, Ruby or Python, and… probably still Lisp, but maybe one of its dialects. And HTML/CSS, obviously. A lot of us don’t need C any more. Software is increasingly hosted on browsers rather than as native applications, and if hacker time is valuable as the guide purports, then we shouldn’t spend it on learning a complicated language unless we intend to specialize in what it’s used for. There’s something to be said for knowing the kinds of things C would teach you, but it seems a stretch to call it necessary.

Similarly, I think using “hacker” as a title of status isn’t the way to go these days. Once upon a time, this made sense. These days… ehh. Especially the idea that you have to be called a hacker by a “well-known” hacker. The community’s too big for us to define that kind of thing now, so what it gets us is this scrabbling for status, and an in-group. Does making “hacker” a title help our cause? Does making an in-group speed the creation of open source software?

It did once, because few people wanted to be part of the group, and hackers needed something to be proud of because no one else held them in high esteem. Times have changed. It’s more important to welcome new members than it is to describe an elite.

Open Source

But wait, don’t you have to contribute to open source to be a hacker?

… Good question. It’s been part of the definition for ages, and for good reason. But we have to respect that in a lot of hackerly fields, there’s not too much of an open source infrastructure. There’s no official way, that I know of, to go post a functionality spec and some back-end code online and say, “I’m looking for a volunteer visual designer for this open source project,” and get one. There’s not really a GitHub that’s geared toward PSD files and so on. I think Adobe’s been trying to make something similar, but I have my doubts about the prognosis of this attempt. (You can use GitHub for this kind of thing, technically, but it’s not usually done. There’s no community that thinks of that as normal. Ergo, no infrastructure.)

I think we can loosen up on this a bit. Let’s say that, in order to be a hacker, you have to be helpful to other hackers. Open source produces better software, yes, but much of its function is just hackers helping each other build stuff. This addition to the definition seems to cover the chaotic-good benevolence that’s kind of integral to hackerliness. So, if you’re a designer and you provide friendly critique and advice, write articles/draw tutorials/make videos about your work, contribute to public domain or creative commons, maintain documentation for your tools, or anything that kind of runs in that vein, consider yourself a hacker.

So, to be a programmer, I think you should be writing code. Don’t care what language, what editor, or even if it’s very good quality code.

To be a hacker, you should be using your skill and creativity to build new things, usually with code but not always, and you should help others who are building things.

One more thing. Your learning shouldn’t be stagnant, for either category. A programmer who’s been spewing out bad COBOL since the eighties is barely a programmer, and definitely not a hacker. And a hacker who’s not refining their technique through focused practice and frequently poking their nose in to see what’s up with the newest shiny tech or trade technique or tool is just… not a hacker, by definition. You have to be building stuff to be a hacker, which means you’re refining your skills as you do so. Technically, you could be writing the same CRUD app over and over and over again, but even that has some room for improvement, and it’s really, really hard to imagine creative builder types limiting themselves to that for long.

These are just my views. I think a lot of the friendlier people in the community would find them reasonable, though.

Competition and impostor syndrome

I think the lack of this elitism is why I’m happier in design classes than programming classes. It’s not that I like design *better.* It’s that I don’t like programming classes, because the air tends to be heavy with the feeling of judgment.

Ever try asking a question on Stack Overflow? 80% chance you’ll be told it’s a stupid question, you aren’t using the right language and framework, you should have Googled x y and z, and your coding style is crap too while we’re at it. Sure, you shouldn’t ask other hackers to fish answers out of the API docs for you, and if you’re asking other people to read your code snippet then you should bother to indent it and comment anything that’s not obvious. I’m not talking about stuff like that. Even questions that seem to me perfectly legitimate get this kind of treatment. It’s just hostile and unnecessary.

I’m a girl, so this judgment stuff goes double for me. I’ve had people kind of assume I’m studying CS/programming because I want to snag a guy who’s going to get a good job, so if I wanted to prove my legitimacy, I’d have to work extra super hard. Some people are just… like this. You can try to live up to the ridiculous standards they set for you before they’ll give you basic peer respect (but you stay on probation indefinitely–there’s no good faith if you have an off day), or you can ignore them, which usually just makes them more angry and hostile. I tend towards the latter because the former is too much work, but keeping them out of my hair feels like defusing a bomb. Everything I do is examined extra carefully, in case there’s something wrong that can be pointed out. The same thing happens to anyone who isn’t white, especially if they’re self-taught. You’re incompetent until proven otherwise.

Designers are more chill–they aren’t so competitive they feel the need to cut each other down. There’s still a lot of skill-based insecurity and comparing everyone else’s work to your own and trying to improve, but they aren’t nasty to each other the way so many programmers are. I can relate. No matter what, I always feel like I haven’t built enough, I don’t know enough, I’m not good enough. This isn’t new, really, it’s just something creative types have to deal with and practically everyone has the same self-dialogue. But when people can’t deal with this by dismissing it and just continuing to work, and instead decide to handle it by throwing their insecurities onto someone else, you just get this warped echo chamber of impostor syndrome and a really toxic environment.

I think a lot of the problem is that not everyone in the field realizes we’re a creative field, not a science or engineering discipline. If we all knew that, we’d know about this dynamic already because all the other creative fields have already been there and done that and learned how to deal with it. I happen to be an artist, writer, musician, cook, and designer as well as a programmer, so I’m fortunate enough to recognize what’s going on, which gives me some protection from it. (Notice I’m not a mathematician, chemistry whiz, or engineer. Funny how that works, isn’t it?)

Where’s this coming from?

This train of thought is heavily inspired by my getting more into freeCodeCamp. I’ve been reading their Medium articles off and on for… gosh… two years? Or more. But I didn’t get involved with the actual organization or their tutorials and stuff, because they’re very web focused and I wasn’t sure that was the direction I wanted to go. (If you’ve been here a while, you might remember I was more focused on mobile apps and desktop programming.)

Anyway, fCC is very anti-elitism, and I really like that about them. It’s why they’ve always been my favorite Medium column/publication/whatever they’re called. It’s very much an “anyone can learn this, yes you are smart enough so no excuses, come here and I’ll show you” organization.

I think this helpful non-competition is something the rest of the field should adopt. It’s not going to devalue your skills, because they have objective value. You can still be proud of how much you know without being a jerk about it. Actually, truly skillful hackers will be held in even higher esteem if more people learn to program, because newcomers will recognize the difficulty of what you’re doing and want to emulate you.

Can we just drop the popularity contest?

Someone out there is going to read this and think I’m not a senior enough hacker for this request to be taken seriously, that I don’t have enough cred myself to kick the door open for other people. I don’t think I have to explain the irony.

Anyway, I guess this post isn’t really for those people anyway. It’s for my larval-stage readers, who are the worst affected by this phenomenon. I just want to make you aware of this message that’s constantly being thrown at you, and let you know it’s total bull.

Make stuff. Learn things. Help people.

You’re a hacker. It’s what you do.

System update: AAAAAAAAAAA <3

AAAAAAAAAAAAAAAAAAA!!!

I have Linux back!

I missed you, my darling. Win10 is just tolerable, but you’re beautiful. It took me an hour or two to figure out how to release you from the clutches of secure boot, fast startup, and proprietary drivers–mostly the first one, as you handled the NVIDIA drivers exceptionally well this time around–and it was worth it.

Ugh it feels so good to be back. It’s so much more… comfortable.

Love letters aside, I now have Mint “Sylvia” with Cinnamon running on the lion’s share of my terabyte drive. Win10 is still on the other 400GB or whatever I gave it, mostly because it runs Adobe Suite for my graphic design class (which is also why it has that much of the drive; those files can end up pretty big).

I don’t know if this is a difference between Debian and Mint, but Mint handled the NVIDIA driver installation way more smoothly. I didn’t have to open xorg.conf or anything. One of the CS professors at my university likes to joke that Ubuntu is an African word meaning “can’t configure Debian.” Well, in this case, I suppose he’s right. I spent somewhere between six and eight hours trying to configure the darn thing, and I’ve been using Linux off and on for years–the off parts mostly being OS X, which still has a Unixy structure you can fiddle around with to some degree.

So, what prompted me to try again? Two things.

One, I want to pick up my Clojure book again at some point soon, and I absolutely refuse to program using the Windows command line. It feels… weird. As if you’re trying to converse with something that should be intelligent, but not only is it eerily robotic and kind of oblivious, there’s a language barrier and only half your words are understood. Feels icky.

Web dev is fine on Windows because you don’t have to touch the OS anyway, everything’s in the browser and text editor. But my workflow for basically any other kind of coding has relied so heavily on the command line for years that I don’t actually know how to use anything else any more. IDEs? What are those?*

And two, I’ve been drooling over the idea of getting it back for a day or two since I answered a reader’s question. I remembered that Debian isn’t the only system I like, and in reminding my fellow (larval?) hacker that Mint is easier on a newbie, I stopped to consider that Mint might be able to handle my drivers better. And it did! Again, I don’t know if this is a Mint thing or a time thing–it could be a time thing, because I do remember trying to use a Debian utility to install the drivers for me, and I’ve given the hardware a few months to age–but the problem I had is magically gone.

I’m not going to question it. Everything is working. Linux is magical. Mint is my Valentine this year.

 

Speaking of which, I spent Valentine’s Day playing Doki Doki Literature Club. That’s what single nerd girls do on Valentine’s Day, right? Play psychological horror games disguised as anime waifu dating simulators, and set up dual-boot Linux? Well, this one does!

Do heed the content warnings. The game is very graphic, overtly emotionally manipulative, does a pretty good job of portraying mental illness at its most horrifying, and just because it doesn’t bother me definitely does not mean that other people with depression will also be okay with it. People without depression have a solid chance of not being okay with it. This game absolutely will do horrible things to nice people and try its best to make you feel terrible about your part in the events. (I am not easily persuaded on this front. I thought it was pretty funny, actually.)

If you’re not sure about whether you should play, either don’t, or watch someone else play it on YouTube. That way you’re not responsible for any of the decisions, so the game can’t guilt you about them. I did this with OFF after some friends told me it was really messed up, but I didn’t think it was that bad. Doki Doki Literature Club is. Actually, it’s everything I wanted out of OFF, flavored with deconstructed anime tropes and fourth-wall-breaking creepiness. It’s a work of art. Do not get attached.

 

Well, I’ve got other stuff to do (like… sleeping…) so I’m going to leave this here for now. Just a small update really, and not very helpful, unless perhaps you’re dealing with NVIDIA driver issues yourself…?

Happy hacking!

 


*Irritating, that’s what. Don’t let me dissuade you if that’s what you’ve been using, though–I’m not the elitist type who’ll say you can’t use them because Command Line Is Always Better!!!!!!!–I just really hate having to learn how IDEs work, or juggling one for every language. Honestly, the reason my default is a really simple code editor (nano, Sublime, gedit, etc) + command line is that I’m too lazy to learn all the integrated systems. I’ve procrastinated even picking up Atom because it does more than, like, text highlighting. I use Sublime sometimes and I know it can do more, but I’ve never really used those features. This goes for the traditional nerd systems as well: I can’t stand vim or emacs. Maybe one day an IDE will melt my jaded nerd heart, or I’ll get a job where I need to use one (in which case its features might actually be useful), but until then… I just don’t need that kind of complication in my life.

Hacking for Designers #1: HTML

(Apologies to David Kadavy. I stole his title. :P)

A note: “Hacking” among programmers just means you’re building something new, usually with code (but not always!). It’s not the kind where you just type a bunch and then say “I’m in.” Programmers refer to those people as “crackers,” and they don’t think very much of them; partly because it doesn’t actually take that much intelligence to break into stuff, and partly because what kind of uncreative programmer can’t find something better to do? Honestly. The exception is of course pentesters, or “white hats,” whose job it is to try to break into systems to find out where they’re vulnerable and secure those vulnerabilities. Pentesters are quite highly regarded. Developing security and finding obscure weaknesses to shield is much more difficult than what most crackers do, and much more valuable.

So, you have the ability to make stuff look awesome through design. It’s important! We need people like you, because otherwise technology would be a royal pain in the butt to use. Design is the reason Apple succeeds even though they don’t always have the best tech, and lack of it is why it took Linux forever to get off the ground as a user’s OS despite being awesome at the code level.

But not being able to code at all can be a bit… limiting. There’s a learning curve, yes, but fortunately the basics aren’t anywhere near as intimidating as they look. You’re definitely smart enough to pick this up. The main barrier is how scary and weird code can look to a neophyte.

The good news is that hacking is a lot more similar to your own field than you think! People draw a dichotomy between software devs and designers like they’re total opposites. Even geographically, our classes are on the opposite side of campus next to the physics and chemistry departments (here at UNI). But this is super weird! We have zero to do with science. They call us computer scientists, but that’s just to make us look fancy. What we are is makers, and the way we learn and the way we work are much more similar to painters and writers and… designers!

So, I’m whipping up a quick guide for you last minute. It’s really not going to take you long to pick this stuff up. Coding doesn’t get difficult until you’re deeper into the field, and if you get that far, you’re probably addicted so it doesn’t matter 😉

This only covers web development. There are lots of other specialties in the field, but this is the one that’s most useful to you.

First Stop on the Nerd Train: HTML

You’re gonna need this. Pure HTML is butt-ugly; you need CSS to do any kind of layouts or colors. But you can’t learn CSS until you learn some basic HTML, because you need something to style. Trying to learn CSS first would be like trying to format a book without the actual text.

HTML is for content and structure; CSS is for layout and styling. Don’t fall into the trap of using tables for layout. People did that a lot in the 90s, and it makes your site ridiculously difficult to code and maintain. It’s also likely to play merry hell with responsive design these days. Fortunately, there are a few newer tools that fix this problem and let you use grid layouts without getting into the whole <table> mess. We’ll get there. HTML first. Just wanted to warn you off this pitfall in case you see it somewhere else.

First off, you’re gonna need a code editor. Right now, the hacker favorite for web dev is Brackets. It’s free, it’s open-source, and you can download it here.

Got that?

Okay, so here’s the basic skeleton for an HTML5 project. Copypaste at will; I’ll explain what all this does in a minute.

<!DOCTYPE html>
<html>
    <head>
        <title>Welcome to Website!</title>
        <meta charset="UTF-8">
        <link rel="stylesheet" type="text/css" href="thisisyourcssfile.css" />
    </head>
    <body>
    </body>
</html>

If you save this as an HTML file and run it in Brackets’ live preview (use the lightning bolt button on the top of the right sidebar), you’re going to see a blank white page. That’s normal! You should also see “Welcome to Website!” in the tab’s label in your browser. That’s because I set it as a placeholder title; you can put whatever you want in there.

About those indents–they’re not strictly necessary in HTML, but they make your code way easier to read! You could put everything on one line if you really wanted, but it would be a pain.

A standard indent is four spaces. However, Brackets does this neat little trick where it turns a tab keypress into four spaces, so you don’t have to wear your thumbs out!

I think the only thing I still haven’t explained is the <!DOCTYPE html> thing. That just tells the browser you’re using HTML5. Nothing weird!

Wait, How Does Any Of This Work?

Skip if you already know about how browsers and servers interact. If not, here’s a quick explanation.

A “live” site (one that can be accessed from anyone’s browser, like google.com) lives on a server. That’s a computer, somewhere, which can talk to other computers and knows how to hand out web pages when other computers ask for them.

So when you type a URL into your browser (e.g., Firefox, Chrome, etc), it uses your Internet connection to go ask that server for the files that make up the web site you want. Usually this means an HTML file, a CSS file (or possibly more than one), a handful of images, and probably some scripts (programs) too, especially if the web site is one you can post stuff on and interact with rather than just read and browse.

The URL you type in points to a location and filename on the server. If you just type the domain name, like somethingsomething.com, then what the server will return is the index.html file (and anything linked to it, like stylesheets and images). You’d still get the same page if you typed in somethingsomething.com/index.html, because it’s the same thing. So, as a web developer, you should always name your homepage index.html so the server knows which file to hand out by default.

All those files are basically just instructions for your browser, telling it how to display all the stuff on the web page and what should happen when you click on stuff. Sometimes you enter text into forms and whatnot, and your browser can send that back to the server, asking for whatever response to that data is appropriate.

For now though, we’ll start with static web pages: that is, pages that are just HTML, CSS, and images. If you need more interactivity, you’ll have to either learn a programming language (JavaScript, Python, and Ruby are very reasonable choices for this), or find and work with someone who does. I encourage you to try learning to code, though! Picking up the basics is pretty easy and lets you do more, plus it looks fantastic on your resume.

Adding Stuff: How Elements Work

Most of your other code is going to go in between the <body> tags. That’s where you put all your content. <head> is a separate element. It isn’t the page header! That’s something different, and you’ll put it in the <body> tags with the rest of your content. <head> is only for metadata, which is stuff that you put in to tell browsers and search engines stuff about your site, like what character set to use, keywords to help searches find your site, and how to find your CSS file.

So, what can you put in the <body>?

Elements!

Here’s the basic structure of most elements: an opening tag, like <p>, then some content, like some text or images, then a closing tag, like </p>. Closing tags have that slash before the letters. It tells the browser that that’s the end of the element (the end of the paragraph, in this case). There are a few elements that only need one tag, like the charset declaration in the head, or image elements, but most need two!

You can have elements inside elements, and that works like this:

<p><a href="heylookalink.com">Here's a link!</a></p>

You never do it like this:

<p><a href="heylookalink.com">Here's a link!</p></a>

Always nest elements inside each other, don’t let them cross through each other like that!

Meet Some Elements

Here are a few elements to get you started. There are a lot more, and you can find out how all of them work from w3schools.com, but these are the ones you probably want at first. They’re all linked to their respective W3Schools pages, so just go there to find out how they’re used. Everything has examples and stuff!

Divs and other style-related dividers

<header> and other semantic elements

These elements are just boxes to put your stuff in so you can style them up later. For example, <header> is for the header in your design, the part that shows up at the top of the page. It’s an invisible box at first, but you can use CSS to paint the box different colors, change its size, add borders, stuff like that. Go to the semantic elements link for the whole list; there’s a bunch like this: <nav>, <article>, <section>, <aside>, <footer>, and so on.
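Just as a sketch, here’s one way those boxes might fit together on a simple page (all the text inside is placeholder):

```html
<body>
    <header>
        <h1>Welcome to Website!</h1>
        <nav>
            <!-- links to your other pages go here -->
        </nav>
    </header>
    <section>
        <p>Your actual content goes here.</p>
    </section>
    <footer>
        <p>Copyright notices, contact info, that kind of thing.</p>
    </footer>
</body>
```

None of this will look like much until we get to CSS, but the structure is what matters right now.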

<div>

These are like the semantic elements, but they’re custom! You can give them classes, for when you have a bunch of divs to style the same way, like <div class="product">, or you can give them IDs, for when you want there to be only one div in that style, like <div id="left-sidebar">. (We’ll talk more about classes and IDs when we get into CSS.) Divs used to be the only kind of “styling boxes” available, but HTML5 introduced the semantic elements above.
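For instance, a page with several identical product boxes and one unique sidebar might be sketched like this (the class and ID names are just made up for illustration):

```html
<div id="left-sidebar">
    <p>Only one of these on the page, so it gets an ID.</p>
</div>
<div class="product">
    <p>First product box.</p>
</div>
<div class="product">
    <p>Second product box, which CSS can style exactly like the first.</p>
</div>
```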

Text formatting elements

<h1> <h2> <h3> … <h6>

These are also referred to as heading elements, but they’re not just boxes. You put text inside, and the browser recognizes that it’s a heading and styles it accordingly. By default, h1 is huge and h6 is pretty small (but still bold, so it stands out from other text). Here’s what my blog’s styling turns them into:

h1

h2

h3

h4

h5

h6

<p>

Paragraph tag. Pretty straightforward!
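Put together, headings and paragraphs look something like this (placeholder text, obviously):

```html
<h1>My Very Important Page</h1>
<p>An introductory paragraph.</p>
<h2>A Subtopic</h2>
<p>Some text about the subtopic.</p>
```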

<ol>, <ul>, and <li>

Numbered (“ordered”) lists, bullet-point (“unordered”) lists, and the list items you put inside them, respectively.
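Here’s a quick sketch of both kinds of lists; notice how the <li> items nest inside them:

```html
<ol>
    <li>First step</li>
    <li>Second step</li>
</ol>
<ul>
    <li>A bullet point</li>
    <li>Another bullet point</li>
</ul>
```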

<a>

Links (they need an href="linkurl.com" to work tho, so your browser knows where it’s supposed to go)

Other stuff

<!-- comments -->

This is an important tag, and one you’ll probably want to use a lot, especially at first. It’s a special one: it lets you make notes to yourself or other developers. Anything inside a comment tag is totally ignored by computers reading your code.

It’s an unusual looking tag, so here’s how it works.

<!-- blaaaaaah it doesn't matter what I put here! the browser doesn't care as long as it's between these tags! -->

You can use comments to warn future readers about some weird hack you put in your code, or you can use them as a to-do list, or you can use them to remind yourself how something works. Or you can put dorky ASCII art or silly jokes in it, so if someone comes along and decides to read your code, it’s like an Easter egg for nerds. There’s a proud tradition of leaving silly comments in source code.

Occasionally you’ll hear developers brag that no one should need comments because code should be written so it’s “self-documenting”; in other words, one should master the art of precise and meaningful names for variables, classes, etc. This is bullshit: sometimes you just need comments, and usually the developer making said claim is NOT one whose code is “self-documenting.” It’s just an excuse for them to be lazy and not write comments. You’ll find that programmers are very lazy–especially the ones who are either very bad at programming, or very very good.

Either way, just go ahead and write your comments.

<hr>

This makes a line across the page. Sounds weird, but sometimes you need it.

<meta>

Meta tags are how you tell other machines things about your code. They always go in between the <head> tags, not the <body>. You can tell your browser what character set to use (this gets important when your site has international visitors). You can tell search engines who the author of the page is. You can give search engines information about what’s on the page by offering it keywords. You can also tell your browser to notice how wide the device is, and change the page’s formatting accordingly–this will get more important as you get into responsive design.

None of the stuff in <meta> will actually show up on your page, but other machines will be able to read it. The only way your site visitors will see any of it is if they use their browser’s developer tools to manually look through your source code. Also, <meta> tags are single-tag elements. You’ll never need to type </meta>.

If this element sounds like a weird grab bag of semi-related stuff, it kind of is. W3Schools can elaborate on everything you can do with <meta>. At some point you’ll need to know about it, but for now it’s okay to skip it and come back later.
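Just so you can picture it, here are a few common <meta> tags (the content values are placeholders):

```html
<head>
    <meta charset="UTF-8">
    <meta name="author" content="Your Name Here">
    <meta name="description" content="A short summary search engines can show.">
    <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
```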

<!DOCTYPE html>

That tag is called a doctype declaration. It tells the browser which version of the web standards your code was written for. Browsers will generally still render your page just fine without it while HTML5 is current, but as the page gets older and the standards change, it might start to get buggy unless there’s a doctype declaration telling the browser which rules you were following. The way we code for the web changes over the years.

This tag has to be the very first line of your HTML file. Don’t try to put it anywhere else.

 

I’m still working on this post, as it’s going to take a bit to cover everything you’ll need to get started, but I’m confident I can get it down to a manageable and relatively non-intimidating post. Yes, I’m publishing it before it’s 100% finished. A blog post isn’t a book, and nothing that’s going to be added here should make the post misleading due to its initial omission. In other words, all the chunks of info on here should be fairly self-contained. (It’s 3AM and I’m getting sesquipedalian (read: wordy) because I’m tired, so I’m going to stop for now. Isn’t that a weird habit?)

Happy hacking!

Here are the new toys I’m playing with right now

So, you might remember that I’m on a web design kick right now. I’ve picked up two or three new things (edit after writing the rest of the post: hahahahahaha) and I’m learning them all at once, because that’s just kind of how I work. Though, this is more sensible than that time in high school when I took my fourth year of French and my first year of German at the same time and kept mixing the two up, until I forgot how to speak English because my thoughts were running on the mutant child of the two languages I was learning, haha.

jQuery: About time I got around to some kind of JS framework. I don’t actually know much JavaScript yet but you don’t need to for the course, and anyway it’s not like picking up a new code language is that much of a stretch after your first one or two. Here’s the free video course I’ve been watching. You don’t need to know anything more than basic HTML/CSS as a prerequisite to that course, and the guy is friendly and not condescending, so go for it. It’s really light and easy, you can watch it while eating lunch or whatever.

CSS Grid: I’m. Fangirling. Over this code design. It’s so sensible. CSS has featured a distinct and glaring lack of any reasonable layout system for years. I am definitely not the only one who’s spent two hours trying to get floats to cooperate. Remember when people tried to use tables as layout structures? Ever try to develop on top of one of those? Hahahahahahaha. It was not fun.

Grid is basically the layout system from Kivy. You do most of your layout in CSS; the actual order of the elements you list in your HTML document can be completely overridden if you like (but there are some ways that Grid makes use of them if you let it). Every programmer who likes to prototype and redesign and tweak major design components in code, rather than designing in Mockplus or whatever first, is going to absolutely love Grid.
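In case you’re curious, here’s a minimal sketch of what that looks like (the class names and sizes are just made up; note that the layout lives entirely in the CSS):

```html
<style>
    .page {
        display: grid;
        grid-template-columns: 200px 1fr;  /* sidebar width, then the rest */
        gap: 10px;
    }
    /* the header and footer span both columns */
    .page > header,
    .page > footer { grid-column: 1 / 3; }
</style>
<div class="page">
    <header>Header</header>
    <aside>Sidebar</aside>
    <main>Main content</main>
    <footer>Footer</footer>
</div>
```

Reorder the HTML and the layout stays put; that’s the part that makes prototyping in code so pleasant.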

It’s supported almost everywhere. The holdout is of course Internet Exploder Microsoft Edge, but the dev team for that has announced they’re working on it. Oh, and apparently mobile Opera doesn’t like Grid either, but… I didn’t even know you could download Opera on a phone??? Who’s using that?

Anyway once it’s supported everywhere, the standards will solidify a bit and we can use it for important stuff. Betcha all the techie companies are going to immediately be after devs who can use it. There are probably some good jobs on the not so distant horizon, folks.

But even if not… the API is so beautiful it makes me want to cry. You should read it just for the soul soothing qualities of a sensible design.

Mockplus: I’m taking a course that’s about designing for web. An actual college course. Thing is, my classmates are all designers, not devs… next class we’re going over the basics of HTML. If we didn’t have a strict attendance policy I’d be tempted to skip. I’d probably still go though because I like teaching people stuff. Bluh, I sound so arrogant. You know what I mean, they just have different skill sets than I do. They can actually use Adobe products without the constant hiss-and-spit dynamics I’ve always enjoyed. But I digress.

Prof asked us to do some wireframes the other week, and I pulled up the free version of Mockplus and figured out how to use the basic stuff. I didn’t need the actual interactive design features it has, just the drag and drop bits, so this barely makes the “learning something” list. Feels like putting Microsoft Office on your resume. Prof freaked out over how detailed everything was, thinking it was going to take me forever… would have taken too long to explain that programmers are waaaaay too lazy to draw out vector graphics crap for the same basic design elements over and over, so it’s drag and drop software. She really didn’t want to hear about my weird nerd program, haha. It looks way more technical than it is though. Give it a shot, it’s a nice way to play with more complicated designs without fiddling around in the code, wasting loads of paper, or trying to get an Adobe product to behave.

Design For Hackers: I’m a couple chapters into this and so far it’s gone over kind of a lot of stuff I knew, but it’s still interesting. If you’re not already a weird designer/programmer hybrid like I am, it’d be a valuable book to have as a visual design guide. If you think visual design isn’t important, I’ll ask you: why did Windows take off and not Linux? Why is Linux becoming more popular now? Good visual design makes a better and friendlier user experience. Don’t sneer at designers. Their website designs may require someone with coding skills in order to exist meaningfully, but great code with an awful interface isn’t much better because no one wants to use it.

A Practical Guide To Designing For The Web: This is a free ebook and you can get it here. The bottom of that page hosts the ebook download links, or you can just read it in your browser. I imported it into the Kindle app on my Android phone for easier reading. I’m only 4% into it but I’d say that at the very least it’s worth flipping around in.

The Tangled Web: I was looking for a book that would discuss how to develop secure web sites, and this came up over and over and over again. Then I saw it was a No Starch book and I instantly wanted it. No Starch is up there with O’Reilly, for me. It’s slow reading and quite technical, but it’s interesting. It… hasn’t actually gone over much code yet; so far it’s a lot of historical discussion of how the back end of the Internet developed (get your brain out of the gutter) and the inherent security issues caused by the old browser wars.

It was hard enough to find a book on “web security” that wasn’t about breaking into stuff -_- le sigh. So, I’m hoping No Starch won’t let me down here and there’ll be some practical advice on how to deal with security issues later on in the book.

 

Yeah, I know, I can’t stick to just one book at a time. I’d probably be farther into each of these if I did. And my old Clojure book is still calling to me, reminding me that it’s been well over a year and I still haven’t gotten into it… so tempting. But. I’ll curb my spacey, dabbler tendencies for now and wait until I’m more solid on my web stuff.

Also, yes, it’s 4AM as I’m writing this. My sleep pattern is kind of jacked up right now–I clocked out at 8PM yesterday and woke up at midnight. Then I read bits of DfH and PGtDFtW, wandered over to the computer and watched some jQuery videos, thought about compiling a list of resources for my designer classmates (a handful of my regular hacker readers might not have gotten into web stuff yet anyway, so it’d likely be useful for you folks too), wandered over here, answered a question in my typical long-winded fashion, and wound up writing this bizarre list. To be honest, this is the first time I realized I was reading/learning/using so many different things at once… whoops. Wait, has it been a month since I started really digging into this? When did that happen? (Spoiler alert: a month ago.)

Okay, uh, I’m getting tired and silly again so I should probably go back to bed. I’ll leave you with yet another list-of-resources post (are you guys getting tired of these? I think I do these a lot?) and say:

Happy hacking!

(And good night.)

On Debian and Mint, and why I like them

I got carried away answering a comment again, but I’m going to leave my reply intact and just make it a post, because I think this question and answer address a barrier to entry in the Linux world–namely, choosing a distro. This isn’t an attempt to answer the whole question; it just explains some opinions.

TL;DR: If you’re new and picking your first distro, I suggest Ubuntu-based Mint (not Debian-based Mint, because you’d have to pick through a lot of Ubuntu Mint docs to find the stuff that applies to it). Debian is well-loved by a lot of programmers because the design is clean and easy to build on and customize until you’ve worn your own groove into it, so if you’re not new and you’re playing around with distros, try Debian if you haven’t already–it’s a classic.

 

kirisky asked:

Hi, Rebekah!
May I know why you like Debian?

Sure thing! Debian is well-loved because it’s solid. It’s sensible. It doesn’t include weird design decisions (unless you count the more recent versions of GNOME, the window manager, which some people don’t like–but you can always just install Cinnamon instead and use that. More on window managers in a minute). Both Debian and my other favorite, Mint, have good package managers, they run a lot of stuff natively, they’re easy to debug, they’re comfy to code on, they’re well maintained and documented… just overall they’re well kept and pleasant to use.

To a geek, Debian is the epitome of “we’re just gonna let you do your own thing.” Debian provides a solid base for whatever workflow customization you like. Even if you don’t care to change things, Debian has this clean, restrained design that’s pleasant to use. I’m using language that sounds like graphic design, because I’ve been hanging around designers lately, but I’m actually not talking about visual design.

I’d describe Mint as “friendly.” It’s a comfortable operating system. It’s like the friend who, you show up at their house and they’re wearing a clean but well-loved pair of jeans and an old band T-shirt, and they have chocolate chip cookies. Maybe the house is a little cluttered, but it feels lived-in.

If you’re asking because you want to try Linux for the first time, I’d recommend Mint. Debian is a very geek-oriented system, and sometimes Mint can be a little easier to handle because it’s specifically designed for its user friendliness. Debian is really nice for programmers though.

I’ll also warn you that there’s a third operating system people call Debian Mint (officially, Linux Mint Debian Edition, or LMDE). Most people, when they talk about “Mint,” mean Ubuntu-based Mint. Debian Mint is a hybrid kind of system where people have taken Mint and tried to take it back to its Debian roots. I know, it sounds confusing. Here’s the history.

Ubuntu is based off Debian. It was, and is, a very successful effort to make Debian more approachable to total newbies. Some of the design decisions in Ubuntu weren’t too well liked by the older programmers (well, you can’t please everyone), so they kept using Debian, which is of course still maintained as its own thing. Mint is based off Ubuntu, and it’s just generally well-loved–I’ve never heard anyone rag on Mint. But people have tried to scale back some of the design characteristics it inherited from Ubuntu. Personally, I’m not exactly clear what those are, but it resulted in Debian Mint. From what I understand, it’s stable, and I think I tried it on a VM at one point? But I wouldn’t recommend it as your first, for the simple reason that trying to find *Debian* Mint documentation among all the *Ubuntu* Mint stuff is kind of a pain when you’re trying to set things up. Sure, some of the Ubuntu Mint stuff will work for Debian Mint, but it’s hard to tell the difference between the stuff that will and won’t work immediately, especially if you’re new.

If you’re confused about the term “window manager,” let me clear it up here. The window manager (strictly speaking, the bigger bundle is called a “desktop environment,” but people often use the terms loosely) is basically every part of the Linux user interface that ISN’T the command line. Linux can totally be run just via the command line, and you can do some basic stuff like edit text files and change configuration settings and even write programs without ever booting up a graphical interface. This is possible because, unlike on Windows, the window manager is a totally separate program! So you can pick one you like. There are window managers that put everything in tiles, so instead of a desktop you learn some keyboard commands to bring up and position stuff. There are window managers that look basically like Windows 7 but cleaner; Cinnamon is one of the nicer ones. Here’s what it looks like as of May 2017 (image is a link to the post it came from). Isn’t it nice looking? It’s got a search bar. Those buttons on the side of the menu are customizable, if I remember right; you can put your favorite stuff in there.

Ugh, I’m drooling, this is making me want to go back to Mint. I wonder if it’d handle my stupid NVIDIA graphics card better than Debian did? Maybe. (Long story. Not really Debian’s fault it can’t handle the extremely odd hoops NVIDIA makes you jump through to make its proprietary drivers work. I bought too new of a card, so the open source drivers were still crappy. Anyway.)

So, I hope that answers your question, kirisky, and hopefully someone else’s too.

Happy hacking!