
The Technium: Pain of the New


URL:http://www.kk.org/thetechnium/archives/2013/01/pain_of_the_new


New media technologies often cause an allergic reaction when they first appear. We may find them painful before we find them indispensable.

I watched the movie The Hobbit. Twice. First I saw it in its "standard" mode. A day later I returned to see The Hobbit in 3D at a high frame rate of 48 frames per second, a format called HFR. HFR is a cinematic technology that promises greater realism. It was amazingly real. And disturbing at first.

Because 48 frames per second is just above the threshold at which the human eye/brain can detect changes between frames, the projected picture seems startlingly whole and "smooth," as if it were uninterrupted reality.

I was surprised, though, that the movie in 48HFR looked so different. (The 3D did not have an effect.) Even though both formats were shot with the same cameras and lighting, they appeared to be lit and shot on different sets. The lighting in the HFR version seemed harsh, brighter, and more noticeable. The emotional effect of HFR was disturbing for the first 10 minutes. And perplexing -- because the only difference between the two movies was that one was displayed at the 48 frames per second it was shot at, and the other was computationally reduced to the normal 24 frames per second. Why would the frame rate distort the lighting and the emotion?

I was not the only one who noticed. The HFR version of The Hobbit -- the first commercial movie to be released in this new format -- stirred up howls from the critics. Very few filmish people liked what they saw. For most it was painful. The reviewers struggled to express what HFR looked like and why it bothered them:

"Audiences looking for a rich, textured, cinematic experience will be put off and disconcerted by an image that looks more like an advanced version of high definition television than a traditional movie." – Kenneth Turan, L.A. Times "One thing The Hobbit is not is a celebration of the beauty of film. A celebration of video-game realms, perhaps." – Steven Rea, Philadelphia Inquirer

All kinds of ailments were ascribed to it, including difficulty hearing:

"I can honestly say I had a harder time hearing some of the dialogue in the 3D HFR version than in the 2D… It was like watching really, really, really atrociously bad state run TV show……High frame rates belong on bad TV shows and perhaps sports." -- Vincent Laforet, Gizmodo My first impression, too, was that HFR reminded me of my first look at video. That theme was repeated by many. But what is it about video that we didn't like at first? "Those high frame rates are great for reality television, and we accept them because we know these things are real. We’re always going to associate high frame rates with something that’s not acted, and our brains are always going to associate low frame rates with something that is not. If they’re seeing something artificial and it starts to approach something looking real, they begin to inherently psychologically reject it." -- James Kerwin, Movieline "Instead of the romantic illusion of film, we see the sets and makeup for what they are. The effect is like stepping into a diorama alongside the actors, which is not as pleasant as it might sound… Never bet against innovation, but this debut does not promise great things to come." – C. Covert, Minneapolis Star Tribune

What's going on here? I really struggled to figure out what was happening to my own eyes and perception. Why would something as simple as changing a frame rate trigger such drastic re-evaluations of cinema?

I researched on the web without much satisfaction, since few people had actually seen 48HFR. I asked a few friends in the advanced cinema industry and got unsatisfactory answers. Then I was at a party with a friend from Pixar and asked him my question: why does HFR change the appearance of the lighting? He also could not tell me, but the man next to him could. He was John Knoll, the co-creator of Photoshop and the Oscar-winning Visual Effects Director for a string of technically innovative Hollywood blockbusters as long as my arm. He knew. I'll put his answer into my own words:

Imagine you had the lucky privilege to be invited by Peter Jackson onto the set of The Hobbit. You were standing right off to the side while they filmed Bilbo Baggins in his cute hobbit home. Standing there on the set you would notice the incredibly harsh lighting pouring down on Bilbo's figure. It would be obviously fake. And you would see the makeup on Bilbo's face in that harsh light. The textbook reason filmmakers add makeup to actors and then light them brightly is that film is not as sensitive as the human eye: these aids compensate for film's deficiencies, its insensitivity to low light and its need for the extra contrast provided by makeup. These fakeries were added to "correct" film so that it seemed more like what we see. But now that 48HFR and high-definition video mimic our eyes better, it is as if we are standing on the set, and we suddenly notice the artifice of the previously needed aids. When we view the movie in "standard" format, the lighting correctly compensates, but when we see it at the high frame rate, we see the artifice of the lighting as if we were standing there on the set.

Knoll asked me, "You probably only noticed the odd lighting in the interior scenes, not in the outdoor scenes, right?" And once he asked it this way, I realized he was right. The scenes in the HFR version that seemed odd were all inside. The landscape scenes were stunning in a good way. "That's because they didn't have to light the outside; the real lighting is all that was needed, so nothing seemed amiss."

Now some of the complaints make sense:

"While striking in some of the big spectacle scenes, predominantly looked like ultra-vivid television video, paradoxically lending the film an oddly theatrical look, especially in the cramped interior scenes in Bilbo Baggins' home." – Todd McCarthy, The Hollywood Reporter "Instead of feeling like we've been transported to Middle-earth, it's as if we've dropped in on Jackson's New Zealand set..." – Scott Foundas, Village Voice

As digital recording continues to increase in resolution, fluidity, and sensitivity, this verisimilitude of "being on the set" will also increase. John Knoll wisely predicts that his industry will quickly learn that it has to abandon the old style of lighting, and also increase the realism of such things as props and special effects. "I liked the HFR version," he said. "We are going to see a lot more of it."

But that is not what the filmish people want. They like the less sensitive, blurry style of film better. One critic even suggested that directors should use soft-focus filters to degrade the clarity of the new digital recordings and restore the "painterly" aspect of classic films.

"Over all, though, the shiny hyper-reality robs Middle-earth of some of its misty, archaic atmosphere, turning it into a gaudy high-definition tourist attraction." – A.O.Scott, The New York Times "At 48 frames, the film is more true to life, sometimes feeling so intimate it's like watching live theater. That close-up perspective also brings out the fakery of movies. Sets and props look like phony stage trappings at times, the crystal pictures bleaching away the painterly quality of traditional film. Like the warmth of analog vinyl vs. the precision of digital music, the dreaminess of traditional film vs. the crispness of high-frame rates will be a matter of taste." – Associated Press

I told Knoll that these complaints about the sterility of the new digital format reminded me of the arguments against CD music albums. Digital was "too clear," "too clinical," not "warm and fuzzy enough," according to audiophiles. CDs missed the musical ambiance, the painterly soul of a song. The critics were not going to buy CDs, and the labels would have to pry their beloved analog vinyl albums from their dead hands. Of course, for average music fans, the clear, hiss-free quality of CDs was soon perceived as much superior, particularly as the "frame" rate of the digital sampling increased past the point of most ears' perception. "That's exactly what it is like," exclaimed Knoll. HFR is the CD of movies right now.

This pattern of initial irritation followed by embrace has been seen in other media introductions. When the realism of photography first appeared, artists favored soft lenses to keep the photos "painterly." Drastic sharpness was startling, "unnatural" to art, and looked odd. Over time, of course, the sharp details became the main point of photography.

Color TV, Technicolor, and Kodachrome all had their detractors, who found a purity and monumentalism in black and white. Color was too gaudy, distracting, and touristy, not unlike the criticism of HFR now.

I predict that with each step new media take toward increased realism, there will be those who find the step physically painful. It will hurt their eyes, ears, noses, touch, and peace of mind. It will seem unnecessarily raw, ruining the art behind the work. This disturbance is not entirely in our heads, because we train our bodies to react to media, and when a medium changes, it FEELS different. There may be moments of discomfort.

But in the end we tend to crave the realism -- when it has been mastered -- and will make our home in it.

The scratchy sound of vinyl, the soft focus of a Kodak Brownie, and the flicker of a 24-frame-per-second movie will all be used to time-stamp a work of nostalgia.


Go, the language for emulators | Dave Cheney


URL:http://dave.cheney.net/2013/01/09/go-the-language-for-emulators


So, I hear you like emulators. It turns out that Go is a great language for writing retro-computing emulators. Here are the ones that I have tried so far:

I really liked this one because it avoids the quagmire of OpenGL or SDL dependencies and runs in your web browser. I had a little trouble getting it going, so if you run into problems, remember to execute the trs80 command in the source directory itself. If you've used go get github.com/lkesteloot/trs80, then that will be $GOPATH/src/github.com/lkesteloot/trs80.
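For instance, assuming a default Go workspace with $GOPATH/bin on your PATH, the steps look something like this:

go get github.com/lkesteloot/trs80
cd $GOPATH/src/github.com/lkesteloot/trs80
trs80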

GoSpeccy was the first emulator written in Go that I am aware of; Andrea had been quietly hacking away on it well before Go hit 1.0. I've even been able to get GoSpeccy running on a Raspberry Pi, X-forwarded back to my laptop. Here is a screenshot of it running the Fire104b intro by Andrew Gerrand.

Like GoSpeccy, Fergulator shows the power of Go as a language for writing complex emulators, and the power of go get to handle packages with complex dependencies. Here are the two commands that took me from having no NES emulation on my laptop, to full NES emulation on my laptop.

lucky(~) % sudo apt-get install libsdl1.2-dev libsdl-gfx1.2-dev libsdl-image1.2-dev libglew1.6-dev libxrandr-dev
lucky(~) % go get github.com/scottferg/Fergulator

What’s this? Another emulator for Andrea Fazzi ? Why, yes it is. Again, super easy to install with go get -v github.com/remogatto/sms. Sadly there are no sample roms included with sms due to copyright restrictions, so no screenshot. Update: Andrea has included an open source ROM so we can have a screenshot.

Update: Several Gophers from the wonderful Go+ community commented that there are still more emulators that I haven’t mentioned.

This entry was posted in Go, Programming and tagged emulation, retrocomputing on January 9, 2013 by Dave Cheney.

Create and delete branches · GitHub Blog


URL:https://github.com/blog/1377-create-and-delete-branches


Now you can create and delete branches from GitHub.com.

Create a branch

In your repository's branch selector, just start typing a new branch name. We'll give you the option to create a new branch:

We'll branch off of your current context. For example, if you're on the bugfix branch, we'll create a new branch from bugfix instead of master. Looking at a commit or a tag instead? We'll branch your code from that specific revision.
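For the curious, the web flow is roughly the equivalent of this terminal session (branch names here are just examples):

git checkout bugfix               # the current context
git checkout -b my-new-branch     # the new branch starts from bugfix, not master
git push -u origin my-new-branch  # publish it to GitHub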

Delete a branch

You'll also see a delete button in your repository's Branches page:

As an added bonus, we'll also give you a link to the branch's Pull Request, if it has one.

Collaborate without the terminal

For small changes like documentation fixes, typos, or if you're just a walking software compiler, you can get a lot done in your browser without needing to clone the entire repository to your computer:

Happy branching!

Have feedback? Let @github know on Twitter

We make sure to read every mention on Twitter. If you find a bug, submit it to support@github.com. Every email is read by a real person.


Haskell Lectures - CS 1501


URL:http://shuklan.com/haskell


University of Virginia CS 1501 Lectures Spring 2013

Lecture 1

Outlines class structure, syllabus, grading policies, and reference text. Introduces the origin and theory behind Haskell.

Lecture 2

Coming soon (Jan 22, 2013)

Lecture 3

Coming soon (Jan 29, 2013)

Lecture 4

Coming soon (Feb 5, 2013)

Lecture 5

Coming soon (Feb 12, 2013)

Lecture 6

Coming soon (Feb 19, 2013)

Lecture 7

Coming soon (Feb 26, 2013)

Lecture 8

Coming soon (Mar 5, 2013)

Lecture 9

Coming soon

Lecture 10

Coming soon

Lecture 11

Coming soon

Lecture 12

Coming soon

VCs Think My Boobs Need An Algorithm « OS Fashion | Open Source Fashion


URL:http://www.os-fashion.com/vcs-think-my-boobs-need-an-algorithm/


Posted by Sindhya Valloppillil on January 9, 2013

Opinions held by the contributor do not necessarily reflect the opinions of OS Fashion and its members.

VCs think my boobs need an algorithm. My boobs don't need an algorithm. If that's not enough, VCs also think that women need a bra subscription. They gave $2M in seed funding to True & Co., an e-commerce bra company with an algorithm and subscription model. Never mind that the clear majority of women don't buy bras every month. This start-up's algorithm involves answering questions online for about 3 minutes, which is not only boring and painful but also futile. The algorithm, like the brand's name, is ridiculous. An algorithm cannot provide you with a better fit, just as answering questions online cannot help you find the best pillow for your preferences. Some products need to be touched and tried on. An algorithm cannot account for technological advancements like soft stretch in bra straps, seamless fits, softer lace with stretch, and good quality padding that isn't cheap and itchy. Finally, as a lingerie brand, this start-up lacks fun and sexy branding. There's a place for an algorithm; it isn't my bra. VCs simply don't understand consumer psychology, consumer purchasing patterns, or what it takes to build a great brand or product. It seems as if they think consumer tech is easy and that anyone can do it. This misunderstanding is a big problem, and VCs are screwing up the ecosystem.

Charlie O’Donnell (@ceonyc), a VC at Brooklyn Ventures, recently tweeted in reply to Sanjay Raman (@sanjayraman), a VC at Greylock Ventures:

“@sanjayraman: Always fascinated by eBay dichotomy: terrible product, yet remarkable liquidity.” << Like a bar full of drunk singles at 3am. — Charlie O’Donnell (@ceonyc) December 25, 2012  

There are companies that manage to produce some liquidity because some sh*t will stick to a wall. But it's still sh*t. The only reason they are around is that their products aren't grossly terrible, but that doesn't mean the brand, its product, or its user experience is great. Better start-ups exist which haven't gotten funding yet because they are not VC cronies. VCs give the founders of those better start-ups a hard time for not having enough traction and being first-time entrepreneurs, while they fund start-ups whose founders have illogical business models. For example: where are the panties in True & Co.'s business model? True & Co. launched in May, yet they are still not selling any panties to go with the bras they sell. For every bra a woman buys, she probably buys at least 5 pairs of panties. Also, panties have much better margins than bras. This is basic business strategy that has evaded the founders of True & Co. as well as the VCs who backed them.

Struggling start-ups which VCs previously had gone nuts over and rewarded with gross amounts of funding include: Dollar Shave Club ($10.8M), Beachmint ($73.5M) particularly Home Mint with Justin Timberlake, and Trunk Club ($11M), to name a few. In the case of Dollar Shave Club, VCs confused the ability to create one funny viral video with the potential to create a "world-class brand" and real traction. Dollar Shave just experienced very ephemeral traction. Has anyone seen Dollar Shave's second video? Exactly. Dollar Shave is a one-video hit wonder – but it was supposed to be a world-class lifestyle brand. Since their Series A funding, Dollar Shave has not launched a single new product, even though they are not doing anything proprietary. Product lead time isn't that long, especially when you have $10.8M. What are they up to now? Sucking. I'm not surprised either. They don't have the consumer product knowledge, relevant experience, or passion to build an e-commerce consumer packaged goods brand. Their product and team aren't compelling for what they've set out to do. Last month, they were giving away a free month of razors, as if their razors weren't cheap enough. But I digress…

The problem at hand is huge:

1. VCs continue to engage in cronyism and fund their friends' startups with crappy, undeveloped business models.
2. Those crappy startups are cluttering the ecosystem.
3. Those crappy start-ups are causing VCs to shy away from consumer investments, to the detriment of other, better consumer start-ups that need funding.
4. And, of course, the Series A Crunch: the crappy start-ups that got funding at high valuations are struggling and unable to get Series B funding, dropping like flies, creating a tech bubble.

The worst part of this problem is, again, that many non-crony, first-time entrepreneurs, regardless of how compelling their business model is, how great their margins are, and how great their team is, still get a hard time and struggle to get funding. VCs would rather fund Home Mint with Justin Timberlake, even though Justin Timberlake is a ridiculous brand ambassador choice for an e-commerce home goods company. And even though many existing fashion curation sites like AHALife are struggling despite being executed very well, VCs continue to fund fashion curation start-ups whose founders have less relevant experience and less compelling business models.

The only solution to this problem is a paradigm shift. VCs should learn the consumer space better. After creating a Series A Crunch by investing in crappy startups with bad teams, bad business models, and bad products and giving them insanely inflated valuations, the responsibility of properly assessing startups is on them. My guess is that they are passing on a lot of potentially great deals because they don't have the attention span to dive in deep and really understand a pitch that's more than "My company is the Pinterest for the wedding industry." They want easy business models because it's less time-consuming for them.

VCs should ask themselves:

1. Is the start-up really solving a problem? (Or are they creating a problem by sending me a bunch of randomly curated crap in the mail?)
2. Who is on the team? Are they superstars from the industry? Do they have a track record of success? Do they have a real understanding of the market and consumer? (Do they realize that a bra subscription doesn't make sense?)
3. Is the product or brand special? Is it better than the competition? How so? Do you know who the competition is? (DollarShave forgot to make a special product and brand. Instead, they just focused on that one-hit-wonder viral video.)
4. What are the margins? Is the price too rich considering the quality and where the product is manufactured?
5. Does the business model even make sense?
6. Can the founders build a company even though they might not be the best fundraisers? Isn't that more important anyway?

E-commerce is only going to continue to grow. It's a huge opportunity for those who understand it. Many fashion tech companies lack real design and branding. Others have great design and branding but are not tech savvy. Some have unnecessary applications of tech, e.g., True & Co.'s algorithm for finding a better-fitting bra. Very few start-ups are strong in design, branding, and tech. It's a great opportunity for start-ups, and VCs, that are both fashion/consumer and tech savvy to really stand out. Props to Marc Ecko's new fund Artists And Instigators and Chris Birch's new venture fund Acqua Ventures, which focus on consumer start-ups. I hope more VC firms follow suit. VCs can choose to evolve or choose entropy. I hope they choose evolution over entropy.

Get the convo going on Facebook or directly with Sindhya on Twitter: @Sindhya

** Correction: Taking a cue from some of my favorite journalists, I'm not embarrassed to correct myself. True & Co. does have an algorithm but no subscription. It does in fact sell panties. However, first-time customers will not find them online within the first 10 minutes of being on the site. I couldn't find the panties on their site yesterday. The algorithm-based questioning lasted about 2-3 minutes. Then I was presented with a multitude of bras. After looking around I got bored. I couldn't find the panties. Not sure what's worse… my mistake, or that I couldn't find the panties?

Original image created by Nesster

Sindhya Valloppillil (1 Posts)

Founder & CEO of Helix Men. Most recently, Sindhya was the Brand & Product Development Manager at ZIRH Skincare. While at ZIRH, she successfully created award-winning products and best-selling products in addition to creating and launching 3 brands: ZIRH Platinum Skincare, ZIRH Warrior Collection of Shower Gels & ZIRH Cocktail Bars. The ZIRH Platinum Drenched Moisturizer won the coveted CEW Beauty Award in 2009; it is the highest honor in the industry. Prior to ZIRH, she worked at Johnson & Johnson – Neutrogena Cosmetics, Limited Brands – Beauty Avenues and L’Oreal USA – Maybelline DMI in various roles including Marketing, Global Brand Image, Product Development and Innovations. She does Trend Consulting for the clients of Primary Global Research and Vista Research. Sindhya graduated with honors from the Global Fashion Management Masters program, a joint program with Fashion Institute of Technology in New York, Institut Francais de la Mode in Paris and Hong Kong Polytechnical Institute. Additionally, she has an undergraduate degree in Fashion Merchandising Management from FIT.

Buffy vs Edward Remix Unfairly Removed by Lionsgate


URL:http://www.rebelliouspixels.com/2013/buffy-vs-edward-remix-unfairly-removed-by-lionsgate


It has been three and a half years since I first uploaded my remix video “Buffy vs Edward: Twilight Remixed” to YouTube. The work is an example of fair use transformative storytelling which serves as a visual critique of gender roles and representations in modern pop culture vampire media.

Since I published the remix in 2009 it has been viewed over 3 million times on YouTube and fans have translated the subtitles into 30 different languages. It has been featured and written about by the LA Times, Boston Globe, Salon, Slate, Wired, Vanity Fair, Entertainment Weekly and discussed on NPR radio. It was nominated for a 2010 Webby Award in the best remix/mashup category. The video is used in law school programs, media studies courses and gender studies curricula across the country. The remix also ignited countless online debates over the troubling ways stalking-type behavior is often framed as deeply romantic in movie and television narratives.

This past summer, together with the Electronic Frontier Foundation, I even screened the remix for the US Copyright Office at the 2012 hearings on exemptions to the DMCA. Afterward my Buffy vs Edward remix was mentioned by name in the official recommendations by the US Copyright Office (pdf) on exemptions to the DMCA as an example of a transformative noncommercial video work.

“Based on the video evidence presented, the Register is able to conclude that diminished quality likely would impair the criticism and comment contained in noncommercial videos.  For example, the Register is able to perceive that Buffy vs Edward and other noncommercial videos would suffer significantly because of blurring and the loss of detail in characters’ expression and sense of depth.” -Recommendation of the Register of Copyrights, October 2012 (Page 133)

Despite the clear and rather unambiguous fair use argument that exists for the video, Lionsgate Entertainment has now abused YouTube's system, filed a DMCA takedown, and had my remix deleted for "copyright infringement". Below is a brief chronicle of my struggle to get Buffy vs Edward back on YouTube where it belongs.

On October 9th 2012 I received a message from YouTube stating that Buffy vs Edward had “matched third party content” owned or licensed by Lionsgate and “ads may appear next to it”. Lionsgate acquired ownership of the Twilight movie franchise in 2012 (via the purchase of Summit Entertainment for 412 million dollars) so the claim appeared to be directed at the 1 minute 48 seconds of footage I quoted from the first Twilight movie in my 6 minute remix.

I always turn all ads off on my remix videos and never profit from them. But sure enough, when I checked my channel, Lionsgate was monetizing my noncommercial fair use remix with ads for Nordstrom fall fashions, which popped up over my gender critique of pop culture vampires. Incidentally, this copyright claim also prevented the remix from playing on all iOS devices like iPads and iPhones because they are not "monetized platforms".

I thought perhaps YouTube’s Content ID System had automatically tagged the video and didn’t understand that it was a fair use. In the hopes I could get the mistake cleared up I immediately used YouTube’s built-in process to register a fair use dispute.

Less than 24 hours later, however, I received another message from YouTube informing me that Lionsgate had reviewed my fair use claim and rejected it, reinstating its claim on the remix and again monetizing the video with intrusive popup ads.

Concerned at what appeared to be a blatant disregard for fair use provisions, I contacted a lawyer at New Media Rights named Art Neill. New Media Rights drafted a rather detailed 1000-word legal argument citing case law and explaining how Buffy vs Edward was in fact about as clear an example of fair use as exists. This included fair use arguments for the nature and purpose of the transformative use, the amount used, and the market effect. YouTube's built-in system now allows for a second round of copyright disputes, called an appeal process. So I returned to YouTube and filed an official appeal of the reinstated bogus copyright claim by Lionsgate, using the fair use argument and legal language from my lawyer. (See the full text of the fair use argument we made here.)

On November 26th 2012, after a month of waiting, I finally got a response stating that Lionsgate had decided to release their copyright claim on my remix. Victory!

Or so I thought.

That same day I noticed another notification from YouTube saying that my Buffy vs Edward remix had "matched third party content" owned or licensed by Lionsgate and that ads may appear on my video. Wait, what? Déjà vu. Hadn't I just spent nearly 2 months dealing with exactly that? On closer inspection, this new claim was for "visual content" owned by Lionsgate, while the claim I had just fought and finally won had been for "audiovisual" content. No further information was provided as to what the difference was between the two claims or what content exactly was supposedly infringing.

It appeared as though Lionsgate just filed two separate infringement claims on the same piece of media.

Confused and slightly frustrated, I once again embarked on repeating the same dispute process as before. I filed my fair use dispute via YouTube's built-in form exactly as I had the first time around.

Again, just like the first time, it was rejected by Lionsgate within 24 hours and they reinstated their claim on the remix.

So again I filed my second long-form appeal using YouTube's system, again making the detailed legal arguments crafted by my lawyer at New Media Rights, which again laid out very clearly all the fair use arguments. And again, I waited for a response.

On December 18th I received notification from YouTube that Lionsgate had again ignored my fair use arguments, rejected my appeal and this time had the remix deleted from YouTube entirely.

I was dumbfounded. And to add insult to injury I was now locked out of my YouTube account and had a copyright infringement “strike” placed on my channel.

In order to regain access to my account I was also forced to attend YouTube’s insulting “copyright school” and take a test on fair use. Since I’ve been giving lectures on fair use doctrine for artists and video makers for a number of years this was a breeze, but still insulting because my video was not infringing in the first place.

Once I was allowed back into my account I found that YouTube is now penalizing me for this "strike" by preventing me from uploading videos longer than 15 minutes.

I consulted my lawyer again and, following the advice on YouTube's copyright FAQ page, he reached out to the representatives of Lionsgate who administer their online content and had issued the DMCA takedown. What he found out from that correspondence was worrying.

Representatives of Lionsgate (a company called MovieClips, which claims to manage Lionsgate's clips on YouTube) confirmed in an email to New Media Rights that they had filed the DMCA takedown on Buffy vs Edward because I did not want them to monetize the remix. In fact, this is exactly what the company's representative, Matty Van Schoor, said in a response email to New Media Rights on December 20, 2012.

“The audio/visual content of this video has been reviewed by our team as well as the YouTube content ID system and it has been determined that the video utilizes copyrighted works belonging to Lionsgate. Had our requestes to monetize this video not been disputed, we would have placed an ad on the cotent [sic] and allowed it to remain online. Unfortunately after appeal, we are left with no other option than to remove the content.”

No other option? How about recognizing that it is fair use and dropping the complaint? They did not answer or even acknowledge our fair use arguments via email, despite fair use being raised multiple times.

Perhaps this is just the action of a rogue studio, but it hints at a bit of a nightmare scenario for transformative media makers and remix artists. The fear is that fair use will be ignored in favor of a monetizing model in which media corporations will “allow” critical, educational and/or transformative works only if they can retain effective ownership and directly profit off them.

It appears that Lionsgate is attempting to do just that. What if every time The Daily Show made fun of a Fox News clip, News Corp. was allowed to claim ownership over the entire Daily Show episode in order to monetize it?

There are limitations on takedowns. For instance, as Neill from New Media Rights points out, Section 512 of the DMCA prohibits knowingly and materially misrepresenting any information in takedown notices. At least one court, in Lenz (the case of the baby dancing to Prince), has even required that senders of DMCA takedown notices consider fair use before sending one.

Buffy vs Edward has now been offline for 3 weeks. Over the past year, before the takedown, the remix had been viewed an average of 34,000 times per month.

Since none of YouTube’s internal systems were able to prevent this abuse by Lionsgate, and our direct outreach to the content owner hit a brick wall, with the help of New Media Rights I have now filed an official DMCA counter-notification with YouTube. Lionsgate has 14 days to either allow the remix back online or sue me. We will see what happens.

This is what a broken copyright enforcement system looks like.

One last note: New Media Rights has offered me invaluable advice and guidance throughout this battle. They are a small, non-profit, two-lawyer operation on a shoestring budget, fighting to make sure artists like me are heard. So if you can, please consider donating to them here.

PS: Until we can get the takedown reversed, you can still watch the HTML5 popup video version of Buffy vs Edward here.

Kippt collects pretty useful links. — on startups — Medium


URL:https://medium.com/on-startups/e00babdcc5cb


It has been about 8 months since I started using Kippt every day and I am loving it. I use the service as a personal bookmarking tool, saving at least a few articles every day. I save interesting news, design-related stuff, CSS tricks, coding tricks, coding tools, JS tools, and a whole slew of things. Instead of googling for stuff you've already seen before, instead of keeping information in your brain, you can just save it once and reference it later.

Single use is great, but can a bookmarking tool really get big? You bet!

I don't just store my bookmarks in Kippt; I follow friends who like to read, friends who like to design, and those who code. From them, every day, I get an aggregation of content that is relevant and useful to me. With absolutely no effort I get the best of the internet, and it takes me less than a couple of minutes to sort through the news stream. Wow!

I am not alone either. By now I have about 16,000 followers on the service. I'd like to think they are getting as much value out of my links as I get from the links of the people I follow. I can see the service is getting more and more traction from the ever-growing number of comments I receive on my shared links. Users thank each other for interesting finds and discuss what they share. Sounds like Pinterest for useful links? You bet! Just as Pinterest collects articles of pretty useless things, Kippt collects pretty useful articles that help me learn and grow every day and make me excited to use it every day.

Kippt exemplifies a service that compounds a single-user experience with the power of the network. Whether you use it by yourself or with friends, the tool creates value every single time you open it. What else could you want?

Personally, I hate it when people say "startup X is killing it." Around Silicon Valley, I hear people say this too many times, and too many times the companies that are actually 'killing it' are the ones you've never heard of before. Kippt is quietly growing to be Pinterest for men and women who like to collect useful things.

Alright, enough of me blabbing. Go give it a try -> http://kippt.com

p.s. Thanks to Jenn for editing.


Dan McKinley :: Whom the Gods Would Destroy, They First Give Real-time Analytics


URL:http://mcfunley.com/whom-the-gods-would-destroy-they-first-give-real-time-analytics


Homer: There's three ways to do things. The right way, the wrong way, and the Max Power way!
Bart: Isn't that the wrong way?
Homer: Yeah. But faster!
- "Homer to the Max"

Every few months, I try to talk someone down from building a real-time product analytics system. When I'm lucky, I can get to them early.

The turnaround time for most of the web analysis done at Etsy is at least 24 hours. This is a ranking source of grousing. Decreasing this interval is periodically raised as a priority, either by engineers itching for a challenge or by others hoping to make decisions more rapidly. There are companies out there selling instant usage numbers, so why can't we have them?

Here's an excerpt from a manifesto demanding the construction of such a system. This was written several years ago by an otherwise brilliant individual, whom I respect. I have made a few omissions for brevity.

We believe in...
- Timeliness. I want the data to be at most 5 minutes old. So this is a near-real-time system.
- Comprehensiveness. No sampling. Complete data sets.
- Accuracy (how precise the data is). Everything should be accurate.
- Accessibility. Getting to meaningful data in Google Analytics is awful. To start with it's all 12 - 24 hours old, and this is a huge impediment to insight & action.
- Performance. Most reports / dashboards should render in under 5 seconds.
- Durability. Keep all stats for all time. I know this can get rather tough, but it's just text.

The 23-year-old programmer inside of me is salivating at the idea of building this. The burned out 27-year-old programmer inside of me is busy writing an email about how all of these demands, taken together, probably violate the CAP theorem somehow and also, hey, did you know that accuracy and precision are different?

But the 33-year-old programmer (who has long since beaten those demons into a bloody submission) sees the difficulty as irrelevant at best. Real-time analytics are undesirable. While there are many things wrong with our infrastructure, I would argue that the waiting is not one of those things.

Engineers might find this assertion more puzzling than most. I am sympathetic to this mindset, and I can understand why engineers are predisposed to see instantaneous A/B statistics as self-evidently positive. We monitor everything about our site in real time. Real-time metrics and graphing are the key to deploying 40 times daily with relative impunity. Measure anything, measure everything!

Part of the deploy dashboard at Etsy. We love up-to-the-minute graphs.

This line of thinking is a trap. It's important to divorce the concepts of operational metrics and product analytics. Confusing how we do things with how we decide which things to do is a fatal mistake.

So what is it that makes product analysis different? Well, there are many ways to screw yourself with real-time analytics. I will endeavor to list a few.

The first and most fundamental way is to disregard statistical significance testing entirely. This is a rookie mistake, but it's one that's made all of the time. Let's say you're testing a text change for a link on your website. Being an impatient person, you decide to do this over the course of an hour. You observe that 20 people in bucket A clicked, but 30 in bucket B clicked. Satisfied, and eager to move on, you choose bucket B. There are probably thousands of people doing this right now, and they're getting away with it.

This is a mistake because there's no measurement of how likely it is that the observation (20 clicks vs. 30 clicks) was due to chance. Suppose that we weren't measuring text on hyperlinks, but instead we were measuring two quarters to see if there was any difference between the two when flipped. As we flip, we could see a large gap between the number of heads received with either quarter. But since we're talking about quarters, it's more natural to suspect that that difference might be due to chance. Significance testing lets us ascertain how likely it is that this is the case.
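To make that concrete (my own sketch, not Etsy's tooling): if the two buckets receive equal traffic, then under the null hypothesis each of the 50 observed clicks is equally likely to have come from either bucket, so we can compute exactly how often a split at least as lopsided as 20-vs-30 occurs by chance.

from math import comb

def two_sided_p(observed, total, p=0.5):
    # Exact binomial test: sum the probabilities of every outcome that is
    # no more likely than the one actually observed.
    probs = [comb(total, k) * p**k * (1 - p)**(total - k) for k in range(total + 1)]
    return sum(pk for pk in probs if pk <= probs[observed])

# 20 of the 50 total clicks landed in bucket A (assuming equal traffic per bucket)
print(two_sided_p(20, 50))  # ~0.20

A p-value around 0.2 means a gap this size shows up in roughly one test in five even when the two versions are identical; nowhere near significant.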

A subtler error is to do significance testing, but to halt the experiment as soon as significance is measured. This is always a bad idea, and the problem is exacerbated by trying to make decisions far too quickly. Funny business with timeframes can coerce most A/B tests into statistical significance.

A simulation of flipping two fair coins. In the green regions, the difference in the number of heads is measured to be significant. If we stopped flipping in those regions, we would (incorrectly) decide the coins were different.
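Here is a rough simulation of the experiment in that caption (my own sketch; it applies a naive two-proportion z-test after every flip, which is exactly the "peeking" being warned against):

import random

def peeking_finds_significance(flips=10000, z_cutoff=1.96):
    # Flip two fair coins `flips` times, testing for a "significant"
    # difference after every flip; return True if the test ever fires.
    heads_a = heads_b = 0
    for n in range(1, flips + 1):
        heads_a += random.random() < 0.5
        heads_b += random.random() < 0.5
        pooled = (heads_a + heads_b) / (2 * n)
        if n > 30 and 0 < pooled < 1:  # skip the start, where the normal approximation is poor
            se = (2 * pooled * (1 - pooled) / n) ** 0.5
            z = (heads_a / n - heads_b / n) / se
            if abs(z) > z_cutoff:
                return True  # we would have (wrongly) stopped here
    return False

trials = 200
false_alarms = sum(peeking_finds_significance() for _ in range(trials))
print(false_alarms / trials)  # far above the 5% a single fixed-length test would allow

Even though both coins are fair, a large fraction of runs cross the significance line at some point, which is why stopping at first significance is always a bad idea.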

Depending on the change that's being made, making any decision based on a single day of data could be ill-conceived. Even if you think you have plenty of data, it's not farfetched to imagine that user behavior has its own rhythms. A conspicuous (if basic) example of this is that Etsy sees 30% more orders on Tuesdays than it does on Sundays.

Gratuitous infographic courtesy Brendan Sudol.

While the sale count itself might not skew a random test, user demographics could be different day over day. Or very likely, you could see a major difference in user behavior immediately upon releasing a change, only to watch it evaporate as users learn to use new functionality. Given all of these concerns, the conservative and reasonable stance is to only consider tests that last a few days or more.

One could certainly have a real-time analytics system without making any of these mistakes. (To be clear, I find this unlikely. Idle hands stoked by a stream of numbers are the devil's playthings.) But unless the intention is to make decisions with this data, one might wonder what the purpose of such a system could possibly be. Wasting the effort to erect complexity for which there is no use case is perhaps the worst of all of these possible pitfalls.

For all of these reasons I've come to view delayed analytics as positive. The turnaround time also imposes a welcome pressure on experimental design. People are more likely to think carefully about how their controls work and how they set up their measurements when there's no promise of immediate feedback.

Real-time web analytics is a seductive concept. It appeals to our desire for instant gratification. But the truth is that there are very few product decisions that can be made in real time, if there are any at all. Analysis is difficult enough already, without attempting to do it at speed.

Your life’s work by David of 37signals


URL:http://37signals.com/svn/posts/3389-your-lifes-work


I’d be happy if 37signals is the last place I work. In an industry so focused on the booms and busts, I find myself a kindred spirit with the firms of old. Places where people happily reported to work for 40 years, picking up a snazzy gold watch at the end as a token of life-long loyalty.

Committing myself to this long-term focus has led to a peaceful work atmosphere and an incredible clarity of purpose. If this is the last job I’ll ever have, I damn well better make sure that I like it. I won’t just tough things out. If shit is broken, we’ll fix it now, lest we be stuck with it for decades.

Two key ideas help inform this dedication. The first is Alistair Cockburn’s metaphor of software development as a co-operative game. Focusing on the residue of knowledge and practices carried over from game to game is far more important than worrying about the output of any one game.

Working people to death to ship any one feature or product is a poor strategy, as it reduces the capacity to ship the next feature or product (burn out, build-up of bad rush practices). It’s far more important to have a system for shipping that improves over the long term than one that heroically manages one monster push.

Second is Jeff Bezos’ idea: “What’s not going to change over the next 10 years?” If you’re going to stick around for decades, you’re better off making investments in things that’ll pay off for a very long time. It applies both to software and peopleware.

Of course, not everyone is at a stage in their life where they’re willing to settle down with a job for decades. But I find I enjoy working most with the people who are.

If you’re not committed to your life’s work in a company and with people you could endure for decades, are you making progress on it?

Learn APIs with Codecademy! | Codecademy


URL:http://www.codecademy.com/blog/52-introducing-api-lessons


When I worked at GroupMe before starting Codecademy, it always amazed me that the company started at a TechCrunch Disrupt hackathon. Its founders somehow built an awesome group texting application in less than 48 hours! How'd they do it? They built on top of another company's technology - Twilio, in this case - and used it to build an app of their own. Twilio sent the text messages, but GroupMe handled group formation, the interface, and more. A year after GroupMe was created at a hackathon, Skype bought it for more than $60m.

That's one of the many examples of the power of APIs - application programming interfaces. They exist to make it easy to interface with applications other people have built. Without APIs, hackathons would be much harder. APIs make it easy to create things - to make things that interface and interact with the real world and the technologies in it. For Twilio, this means interacting with phone numbers. For YouTube, it's with videos.
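To give a flavor of what building on an API looks like (a hypothetical sketch using Twilio's current Python helper library; the credentials and phone numbers are placeholders):

from twilio.rest import Client

# Placeholder credentials from a Twilio dashboard
client = Client("ACxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "your_auth_token")

# Twilio handles the telephony; your app supplies everything around it
message = client.messages.create(
    body="Your group says: pizza at 8pm!",
    from_="+15005550006",  # your Twilio number
    to="+15551234567",     # a group member
)
print(message.sid)

An app like GroupMe keeps track of the group and fans each inbound text out to every member; the call above is the entire "send a text" part.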

Codecademy has long taught people the basics of programming and how to build things like games and websites. It's always been our goal to help people create things - to make companies, products, and real-world applications. Today, we're one step closer to that. We worked with great companies like YouTube, NPR, Bitly, SoundCloud, Parse, and more to teach you how to build simple API apps. What can you do with these APIs? Build awesome websites with video with YouTube's. Shorten links on the fly and grab stats with Bitly's. Mash up the news with NPR's. That's just the beginning - we'll be adding more APIs soon!

Programming is an amazing skill because it lets you create things on your computer. Using APIs makes that one step easier. It's often hard for developers (even professional ones!) to get up to date on the latest APIs and to learn how to use them. Dense documentation makes it nearly impossible to pick up an API and start programming immediately. These new Codecademy lessons should be just as helpful to experienced developers as they are to total newbies. They'll help you get up and running faster than ever.

Amazing products and projects have been built on the APIs that we're starting to teach today. This is just the start - if you have an API you'd like to teach or one you'd like to learn more about, let us know! Let's build something great together.

Get started now!

We’re going Open Source! | FileRock Blog


URL:http://blog.filerock.com/2012/12/were-going-open-source/


Today, we have proudly released the source code of FileRock Client. Go and check it out!

We have always said that our first priority is the security of your data. We protect your confidentiality by encrypting your files client-side. We protect the integrity of your data by making sure that you get notified if someone tampers with your files.
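In outline, the pattern works something like this toy sketch (my own illustration of the general idea, using Python's cryptography package; it is not FileRock's actual code):

from cryptography.fernet import Fernet
import hashlib

key = Fernet.generate_key()  # stays on the client; the server never sees it
ciphertext = Fernet(key).encrypt(b"my private file")  # only ciphertext is uploaded

# The client records a digest of what it uploaded...
expected = hashlib.sha256(ciphertext).hexdigest()

# ...and later, when the file is synced back down from the server:
retrieved = ciphertext  # stand-in for "downloaded from the server"
if hashlib.sha256(retrieved).hexdigest() != expected:
    raise RuntimeError("The server copy was tampered with!")
print(Fernet(key).decrypt(retrieved))  # b'my private file'

The real client has to handle key management and many files, but the two promises above, confidentiality via client-side encryption and integrity via tamper detection, reduce to this shape.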

But the only way for you to really trust that we keep our promises, that we have built a fully secure cloud storage sync/backup client, is to examine the source code.

The code is now out there for all of you to review and try out; send us your feedback. Note that you will still need a FileRock account in order to use FileRock Client: if you don't have one, leave us your e-mail on our homepage and we'll send you an invitation code.

FileRock Client is licensed under GPL version 3, so you can modify it and employ it in other projects, as long as they are kept under the GPL. Enjoy our code: if you use it yourself or hear of someone using it, let us know!

This entry was posted in security, source code. Bookmark the permalink.

What Startups Need To Know About Health Insurance in 2013


URL:http://blog.simplyinsured.com/what-startups-need-to-know-about-health-insurance-in-2013/


by vivek on January 9, 2013

At SimplyInsured, we are avidly watching the changes in regulations and rates in the national health insurance market. As we head into 2013, we created a quick guide for startups summarizing the major changes to expect, and our advice on how to manage the changes.

Changes to Health Insurance in 2013

Health Insurance Rates are going up 10-30% in 2013

Most of the major health insurance companies are raising their rates by 10-30% in 2013 – many of the increases go into effect in January. In California, the worst offender is Anthem Blue Cross – they have filed for a 26% price increase across the board. It's not just California, either. Aetna's CEO stated publicly that he expects health insurance premiums to "rise by 100% in some areas." At SimplyInsured, we did our own analysis of prices from last year to this year, drawing on public statements from the major insurance companies. Many large states are expecting significant premium increases in the coming year.

Our Advice: Kaiser and Cigna have not announced major rate increases – yet. This is a great time to switch to one of those providers and lock in low rates for 6-12 months. If you're on a company or group plan, you can lock your rates down for even longer; state regulations prevent insurance companies from increasing or decreasing rates by more than 10% in a given year. We have a comprehensive listing of the most affordable plans from all the major providers, which you can browse at www.SimplyInsured.com.

Employers must report health expenses on W-2s

Starting in 2013, employers must report any health insurance related spending to the IRS and to employees on the W-2 form. Similar to 401K deposits and life/workers' compensation insurance, this is a mandatory reporting requirement starting this year. The penalties for failing to disclose these numbers accurately can be steep. Nobody wants an IRS audit.

Our Advice: Make sure you do this. Don't get audited by Uncle Sam.

Taxes are going up – but so are tax credits

Taxes are going up by 1% on all incomes over $200K/year – which may affect some startup founders earning significant salaries or capital gains in 2013 or 2014. But smaller startups can also save $5,000-$10,000 per year in taxes with the Small Business Health Care Tax Credit.

One often overlooked benefit of the Affordable Care Act is a provision known as the "Small Business Health Care Tax Credit for Small Employers". This tax benefit is for small businesses with up to 25 employees and average wages of up to $50,000 per employee – which is perfect for early-stage startup companies looking to save $5,000-$10,000 in taxes. The exact wording of the law is pretty complicated, so we've included this simple calculator to help you figure out your savings (see also the rough sketch at the end of this post).

Our Advice: Make sure you speak to your accountant or tax advisor this year to take advantage of this tax credit. The credit actually increases by ~50% in 2014, meaning savings of $7,000-$15,000. The IRS reported that less than 7% of eligible companies took advantage last year. Make sure you're in the savvy part of that statistic next year!

Increased coverage transparency and plan choice

There are two major benefits to transparency in 2013: the mandate of Summary of Benefits and Coverage (SBC) forms and the launch of Health Insurance Exchanges towards the end of the year.

Summary of Benefits and Coverage (SBC) – the insurance cheat sheet. One significant provision of the Affordable Care Act requires all insurance companies to publish detailed, "plain English" explanations of the terms and benefits of every insurance plan. Unfortunately, the government-mandated templates are still fairly confusing and full of legal jargon, but at least there is now a standardized template with which to compare health insurance plans. We've attached a snapshot of the template below – the added detail will definitely benefit transparency for people wanting to do their homework.

Our Advice: The mandated insurance templates are a great first step, but for easy comparison and "plain English," check out SimplyInsured's "What If" feature. Our algorithms model expected costs for ~100 conditions and medical emergencies – providing interactive, custom out-of-pocket estimates that are easily comparable between plans. We built the tool to be interactive and significantly easier to use than the government templates.

Health Insurance Exchanges. Starting October 1, 2013, the federally managed Health Insurance Exchanges are required to be up and running for open enrollment. The exchanges are also mandated to be operating by January 1, 2014. Why is this good? The exchanges will open up the insurance market to people "trapped" in their current employer insurance plans. The exchanges will give consumers the freedom to choose health insurance from any plan and provider on the open market. In addition, several expanded Medicare and Medicaid subsidies will give options to low-income individuals and those suffering from pre-existing conditions.

Our Advice: The insurance exchanges will be a blessing for many people, but they will not go into effect until early 2014. If you want to save money now, the best option is to find an affordable plan on the private market. Saving $100/month starting today is a sure win, compared to potentially saving money in the future when insurance premiums are higher.
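As a back-of-the-envelope illustration of how the tax credit's phase-out works (my own sketch based on the publicly described Form 8941 structure; the figures are illustrative assumptions, not tax advice):

def small_business_credit(premiums_paid, fte_count, avg_wage):
    # Through tax year 2013, the maximum credit is 35% of employer-paid premiums.
    max_credit = premiums_paid * 0.35
    credit = max_credit
    # The credit phases out linearly between 10 and 25 full-time-equivalent employees...
    if fte_count > 10:
        credit -= max_credit * (fte_count - 10) / 15
    # ...and between $25,000 and $50,000 in average annual wages.
    if avg_wage > 25000:
        credit -= max_credit * (avg_wage - 25000) / 25000
    return max(credit, 0)

# e.g. a 15-person startup paying $40,000 in premiums with $30,000 average wages:
print(small_business_credit(40000, 15, 30000))  # ~$6,533, before other limits

The exact rules have additional limits and definitions (FTE counting, owner exclusions), which is why your accountant gets the final word.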

We’ll keep you updated as we learn more – good luck in 2013!

Tagged as: affordable care act, health, health insurance, health insurance quotes, insurance, quotes, small business health insurance quotes, year in 2013

Vulnerabilities in Heroku


URL:http://stephensclafani.com/2013/01/09/vulnerabilities-in-heroku/


Recently, while contemplating hosting options for my startup, I decided to take a look at Heroku. Upon signing up I noticed that Heroku used a two-step sign-up process. Multi-step sign-up processes are notorious for containing security vulnerabilities, and after taking a closer look at Heroku's I found that it was possible, given only their user ID, to obtain any user's email address and to change their password.

Sign Up Vulnerability

In the first step of Heroku's sign-up process a user enters their email address:

Upon submitting the form the user is sent a confirmation email containing a link to activate their account. The activation link consists of the user’s ID and a token:

https://api.heroku.com/signup/accept2/1234567/ec2682960544872b5f7c0bcc12531a3a

Upon loading the activation link the user is prompted to set a password for their account. The email address that the user entered in the first step is displayed:

When the form is submitted a POST is made containing the user’s ID and the token from the activation link:

POST https://api.heroku.com/invitation2/save

id=1234567&token=ec2682960544872b5f7c0bcc12531a3a&user[password]=123456&user[password_confirmation]=123456&user[receive_newsletter]=0&user[receive_newsletter]=1&commit=Save

If the POST was made with the token parameter removed and the password fields left blank, the resulting error page would display the email address of any user whose ID was put as the value of the “id” parameter:

POST https://api.heroku.com/invitation2/save

id=any_users_id&user[password]=&user[password_confirmation]=&user[receive_newsletter]=0&user[receive_newsletter]=1&commit=Save

If the POST was made with the token parameter removed and the password fields filled in, the user’s password would be changed:

POST https://api.heroku.com/invitation2/save

id=any_users_id&user[password]=123456&user[password_confirmation]=123456&user[receive_newsletter]=0&user[receive_newsletter]=1&commit=Save

Reset Password Vulnerability

A second vulnerability was found in Heroku’s reset password functionality. By modifying the POST request it was possible to reset the password of a random (nondeterministic) user each time that the vulnerability was used.

If a user has forgotten their Heroku password they can use the Reset Password form to reset it:

Upon submitting the form the user is sent an email containing a link with which they can reset their password. The link consists of an ID:

https://api.heroku.com/auth/finish_reset_password/5ffe213db0fc1543a1335L70eb4273cfe

Upon loading the link the user is prompted to set a new password for their account:

When the form is submitted a POST is made containing the ID in both the URL and body of the request:

POST https://api.heroku.com/auth/finish_reset_password/5ffe213db0fc1543a1335L70eb4273cfe

id=5ffe213db0fc1543a1335L70eb4273cfe&user_to_reset[password]=123456&user_to_reset[password_confirmation]=123456&commit=Save

If the POST was made with ID removed from both the URL and body, the password of a random account would be reset and the account automatically logged in to:

POST https://api.heroku.com/auth/finish_reset_password/

user_to_reset[password]=123456&user_to_reset[password_confirmation]=123456&commit=Save
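Again, the server code isn’t public, but resetting a random user suggests a loose lookup that matched some record when given no key at all. A minimal sketch of the fix (hypothetical names, same caveats as above):

def finish_reset_password(reset_id, new_password, find_reset_by_id):
    """Sketch of a safer 'finish reset' handler; all names are
    hypothetical, showing the class of fix rather than Heroku's code."""
    # Refuse blank identifiers outright, in both the URL and the body.
    if not reset_id:
        return (400, "missing reset ID")
    reset = find_reset_by_id(reset_id)
    if reset is None:
        return (404, "unknown or expired reset ID")
    reset["user"]["password"] = new_password
    return (200, "password reset")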

Disclosure

I reported these issues to Heroku on December 19. Initial fixes were in place within 24 hours. Heroku asked me to hold off on publishing a public disclosure so that they could review their code, which I agreed to.

Update: Heroku’s official response.

Despite finding these vulnerabilities, I plan to host my startup at Heroku. Security vulnerabilities happen, and Heroku handled the reports well.

Note: All of Heroku’s forms are protected against CSRF with an “authenticity_token” parameter. I removed the parameter from the above examples for clarity.

How many Raspberry Pis does it take… | Raspberry Pi


Comments:"How many Raspberry Pis does it take… | Raspberry Pi"

URL:http://www.raspberrypi.org/archives/3011


The folks at element14/Premier Farnell announced today that they alone have now made and sold more than half a million Raspberry Pis. And they’re only one of our two official distributors; we don’t have completely up-to-date figures from RS Components yet, but Farnell’s news suggests that we’re well on the way to having sold our millionth Raspberry Pi.


I’ve not got much more to say – but I will note that we’ll be opening a bottle of something fizzy tonight!

 


David Kendal: A man, a Plan.


Comments:"David Kendal: A man, a Plan."

URL:http://davidkendal.net/articles/2011/11/a-man-a-plan


In 2001, Paul Graham wrote Beating the Averages, a treatise on programming language design that turned out to be a precursor to the design essays for Arc.

On the whole, Beating the Averages holds up well eight years after its last revision. I can’t possibly hope to write anything comparable without repeating it wholesale, so in a sense it has also become a design essay for Plan. This essay will serve as another, explaining where I think Graham is wrong, or at least why Arc hasn’t taken off even with its intended demographic.

Let’s throw out syntax right away. Hackers have long loved Perl, whose syntax draws at least as many complaints per user as Lisp’s. A good friend of mine who is new to Lisp (coming from PHP, with some Ruby and Python) recently commented that he in fact likes how regular its syntax is and how everything is expressed as a function — even the mathematical operators. And if Lisp’s lack of syntax is off-putting, there are always read-macros (at least in Common Lisp, and eventually in Plan). So we’re assuming that syntax is not a big deal to programmers who don’t know Lisp.

How can we find out what really makes a language popular? If we examine the mechanics of popularity, it might give us some clue.

So why is Arc, and Lisp in general, so unpopular? Paul Graham has said before that his goal isn’t popularity — at least not mass popularity. The popularity Graham seeks is with smart people (hackers). However, I think popularity with hackers and popularity with everyone else are not independent factors.

I think it’s important to attract the programmers whom Graham calls “the masses”: the non-hackers. Without them, the real hackers won’t come to your language en masse either.

Why?

The first reason is exposure and awareness. Though hackers pay great attention to new developments in programming languages, there is so much development happening that it’s impossible to be aware of every new thing. Even if a hacker has heard of a programming language, he probably still won’t learn it. Because programming languages are so sticky, a hacker may just see the name of your language and skip over it, so there’s no increase in your language’s mindshare. Or he might find out just enough about it to see what influenced it and whether it has interesting unique features; at least then he’s more aware of it, but he still isn’t programming in it daily.

It’s a big job to convince people to program in your language — hackers most of all. Hackers are skeptical, and because they’re so smart, they can immediately pick apart any argument favourable to you. But the one thing hackers can’t do is alter their subconscious. If people (even dumb people) keep talking about a programming language, it’s going to interest a hacker in it, simply because its name will be imprinted in his mind.

This doesn’t mean you can attract smart hackers to your language just by running an advertising campaign, though. Sun tried that with Java, but a) their marketing was too explicit, so it just seemed like one company trying to push its own technology, and b) Java itself is a detestable language. It seems, then, that language quality is not something you can make up for with marketing.

So the formula thus far is: exposure to people who’ll talk about it and use it + being a good technology = smart people working with your language.

Another reason it’s important to appeal to regular programmers as well as hackers is that hackers will write libraries and make your platform.

Hackers are quietly egotistical. They’re thrilled by people using their code. If nobody using your language needs libraries, because they can all write their own code, then nobody will release libraries; and when libraries are released, only a small number of people will use them.

In other words, if you have people who need libraries, hackers will write them. When you have more libraries, your language looks more attractive to people who need them, so hackers write more of them, attracted by the desire to learn and solve interesting problems (even ones that have been solved before in other languages). You end up in a slow, but virtuous, cycle in which your platform looks increasingly attractive.

This is a simplistic view, of course. In reality, hackers use libraries, and non-hackers do publish code. But in practice this only speeds up the cycle, because it means hackers are also attracted by the existing code on your platform, and non-hackers are attracted by the ability to have their egos stimulated by seeing other people using things they made. The result: lots of smart people are using your language. (You also have a bunch of less-smart people, but they are learning all the time to program well and become smart.)

Another problem for Lisp is that the rest of the world doesn’t think or work like Lisp. It’s the same problem that plagued the Macintosh in the mid-90s: it had too many proprietary, Mac-specific modi operandi which ignored the industry standards. Networking is a good example: Ethernet on the Mac was never great before the iMac, because they wanted you to use LocalTalk. Even the Ethernet ports provided were not standard RJ-45 sockets, but rather a proprietary socket which required an adapter. (To be fair to Apple, there was no RJ-45 when they added Ethernet. But there was no reason to keep using the proprietary jack when the standard arrived a few years later.)

Lisp has a similar problem. For example: the way to write web-apps in Lisp, pioneered by Paul Graham, is to write them in continuation-passing style. You store closures on the server, giving them unique IDs in a hash table. When a user performs an action, the closure for that event is called. The problem: because closures can’t be stored in databases, you really have to use a hash table in your web daemon’s memory. So you can’t scale to more than one server: a user performing an action might land on a different server from the one holding the closures generated by their last action, so the server goes looking for a nonexistent closure and the user gets an error. There are ways to cheat around this, but it’s a fundamental problem with the architecture.
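A toy sketch (in Python rather than Lisp, with invented names) makes the trap concrete: the table of continuations lives in one process’s memory, so a request routed to any other server cannot find the closure it needs.

import uuid

# Per-process table of pending continuations: ID -> closure.
# This dict lives in *one* server's memory, which is the whole problem.
continuations = {}

def render_page(cart):
    def on_checkout(form):  # closure capturing `cart`
        return "charged for %d items" % len(cart)
    cont_id = uuid.uuid4().hex
    continuations[cont_id] = on_checkout
    return "<a href='/k/%s'>checkout</a>" % cont_id

def handle_action(cont_id, form):
    closure = continuations.get(cont_id)
    if closure is None:
        # On a multi-server deployment the user often lands here:
        # the closure was created by a different process.
        return "error: expired or unknown continuation"
    return closure(form)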

So the conventional, stateless-plus-cookies way of building websites turns out to have an important, desirable property: it’s far easier to scale, because it was designed to be easy to scale. (Plus it’s a whole lot easier to have readable URLs when you’re not generating random closure IDs to jam into the path every time a page loads.)

There are similar examples where Lisp just wants to bend the real world to its own strengths, and the real world just won’t budge.

The fact is, the rest of the world often has a pretty good reason for doing things the way it does. Lisp people just want things to be easy for them, so they have their own idiosyncratic ways of working around the existing standards. It’s the equivalent of Apple letting AppleTalk run over Ethernet; it still cuts out the rest of the world.

It’s mostly by historical accident that Lisp doesn’t fit so well into the world of Unix. Scheme was designed in the mid-1970s, when Unix was not so well-known and still looked like a research operating system, and Common Lisp in the 1980s when Unix had still not won, so to fit its mission it had to be as system-agnostic as possible.

But it’s now the 2010s. It’s not hard to realise that people expect integer->char to be called chr, and to be able to call fork and have a new operating-system–level process appear.

(And, as if to complete my metaphor, Lisp’s networking remains woefully inadequate without implementation-specific functions.)

Just as the Mac was saved by OS X, which planted the classic Mac on top of a Unix core and encouraged Mac developers to think like Unix hackers, Lisp needs to be saved by planting it on top of a load of Unix APIs, and by encouraging Lisp hackers to drop some of the habits that work around (rather than with) the standards they have to follow. (This doesn’t mean we have to drop everything.)

This also means Lisps should have an object system. I think a basic Lisp on its own provides enough functionality to implement a good-enough object system, in much the same way that Perl’s bless gave way to frameworks like Moose. With lexical closures, a native symbol kind, and cons pairs, we can do everything any other language’s object system can do.
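As a sketch of that claim (Python closures standing in for Lisp’s), an “object” can be nothing more than a closure over its state that dispatches on a message symbol:

def make_account(balance):
    """A message-passing 'object' built from nothing but a closure;
    dispatch-on-symbol is what closures and symbols buy you in a bare Lisp."""
    def dispatch(message, *args):
        nonlocal balance
        if message == "deposit":
            balance += args[0]
            return balance
        if message == "withdraw":
            if args[0] > balance:
                raise ValueError("insufficient funds")
            balance -= args[0]
            return balance
        if message == "balance":
            return balance
        raise ValueError("unknown message: %s" % message)
    return dispatch

acct = make_account(100)
acct("deposit", 50)   # => 150
acct("withdraw", 30)  # => 120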

Regular expressions, the standard Unix way to deal with text, have also been subject to ignorance and well-meaning Lisp-ification. So it’s important to have regular expressions as first-class objects in the language, with their own syntax. While storing regexps in strings is okay, it’s far cleaner to have a separation within the language.

a variety of people using it + being good and powerful + active library development + following the Unix way = success

Almost every programming language that is popular with hackers today has all four elements going for it. Although there’s a long way to go, I think we might be on the verge of a Lisp renaissance — but only if more languages can gain traction.

Adobe almost does something amazing by accident | Ars Technica


Comments:"Adobe almost does something amazing by accident | Ars Technica"

URL:http://arstechnica.com/information-technology/2013/01/adobe-almost-does-something-amazing-by-accident/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+arstechnica%2Findex+%28Ars+Technica+-+All+content%29


It seemed like an intriguing deal. An old version of Adobe Creative Suite—the 2005 vintage CS2, to be precise—became freely downloadable from Adobe, with nothing more than a free-to-create Adobe ID required from users. Although basically useless for Mac users, as CS2 is only available for PowerPC, for Windows users this is a powerful, if not quite cutting edge, suite of graphics apps.

This looked like a clever move from Adobe. Photoshop is widely held to be one of the most routinely pirated applications there is. In making an old but still serviceable version of the software available, Adobe appeared to be offering a good alternative to piracy: instead of using a knock-off copy of CS6, just use CS2.

A free CS2 would also go some way toward starving alternative applications of oxygen. Given the choice between a free copy of CS2 and downloading, say, the GIMP, one imagines that many users would plump for the commercial application. It's more of a known quantity, with a more polished user interface. And Photoshop is, frankly, the gold standard of bitmap image editing. Even an older version has a prestige that GIMP doesn't. This is not to say that CS2 is necessarily superior to the GIMP; it may or may not be. It doesn't really matter; Photoshop has a reputation and respect that the GIMP doesn't have, and even if some might argue that it was undeserved, it influences the decisions users make.

Giving away an old version in this way certainly appears unusual, and perhaps even a little brave for a commercial company such as Adobe. But Adobe is already being quite brave at the moment. The company is in many ways reinventing the way it both develops and licenses its products. It is creating a wide range of HTML5-oriented tools under the Edge brand that use a mix of open source and proprietary technology, and it is pushing hard its subscription software model with the Creative Cloud.

In this context, giving away an old version of its software doesn't seem quite so outlandish. It might sacrifice some revenue (though one suspects not all that much), but it strengthens Photoshop's dominance—and also makes Adobe look pretty good, to boot. And although an unusual move, it's not entirely unprecedented. Just last month, Microsoft made its previously commercial Expression suite freely downloadable after the company decided to cease further development. But this isn't quite the same; Creative Suite is still a going concern for Adobe. Expression isn't for Microsoft.

Unfortunately, it appears that Adobe wasn't really intending to give out CS2 for everyone. Shortly after news of the apparently free software spread across Twitter on Monday, the download page became unavailable, producing an error instead. Subsequent blog and forum posts indicate that this wasn't an inspired decision to liberate an obsolete but still useful application after all. It was something between a mistake, an error of judgement, and a misunderstanding.

CS2 used a product activation scheme to control licensing. When you install the software, it interrogates an Internet server to ensure that the license key you entered is acceptable. In December, Adobe retired the activation servers used by CS2. This posed a problem for CS2's licensed users, because without the activation servers, they can no longer reinstall the software.

To help these people out, Adobe offered versions of CS2 that didn't need activation. Mere entry of the serial numbers that Adobe put on the download page would suffice. The company says that although it looks like it was giving the software away for free, it in fact wasn't. It was just trying to assist its customers. Adobe says in order to legally use CS2, users still require a purchased license.

There are ways that Adobe could have helped out these users that didn't result in putting the software up on a server that anyone could get at. For example, the company could have released a patch that removed the activation checks from the applications and the license key entry from the installer. This could work with original media, and hence not require distribution of CS2. For whatever reason, the company decided not to go this route.

So it turns out that rather than doing something a little bit daring and unusual—something that might even inspire a new approach to licensing old, obsolete software—Adobe was doing something somewhat useful for existing, paid-up, licensed users, in a rather peculiar way. This is a shame. The company could have earned a lot of goodwill by making CS2 free, and it would have been easy enough to offer a no-cost license for the software.

There is one final surprise. Originally, acquiring CS2 required an Adobe ID. It seemed a fair enough trade; Adobe knows your e-mail address and name, and in return you get some no-cost software. Since the whole issue blew up on Twitter, forcing the company to issue its clarification, perhaps one would have expected it to restrict access to the downloads, or use some other technique to remove the activation check.

It has not. Instead, Adobe has made CS2 even easier to get, by removing the Adobe ID requirement. The company created a new CS2 download page, and this time around, it had no registration requirement at all.

It's almost as if the company wanted people to download the software.

Update: Or perhaps not. The new download page has now been pulled. Alas. While it's still working for some people, for others, it's redirecting to a CS6 page.


Microsoft Migrating Messenger Users to Skype on March 15


Comments:"Microsoft Migrating Messenger Users to Skype on March 15"

URL:http://thenextweb.com/microsoft/2013/01/09/microsoft-emails-messenger-users-to-let-them-know-the-service-is-retiring-on-march-15-and-to-upgrade-to-skype/


Microsoft on Tuesday started mass emailing its 100 million+ Messenger users to let them know that the service is officially being retired on March 15, 2013. On that date, all users will be migrated to Skype, which Microsoft acquired back in May 2011 for $8.5 billion.

This means Messenger will be shut down in just 66 days. After that it will keep working only in mainland China, where Skype is operated by a local provider called TOM.

The email in question is titled “Important info about your Messenger account.” Here’s the main part:

On 15th March 2013 we are retiring the existing Messenger service globally (except for mainland China where Messenger will continue to be available) and bringing the great features of Messenger and Skype together. Update to Skype and sign in using a Microsoft Account (same as your Messenger ID) and all your Messenger contacts will be at your fingertips. You’ll be able to instant message and video chat with them just like before, and also discover new ways of staying in touch with Skype on your mobile and tablet.

Microsoft previously said the transition would happen in Q1 2013. The date was only revealed today, however, confirming the company is still right on schedule to make the big move from Messenger to Skype.

If you’re wondering how this will work, it’s really quite simple. Skype will give you the option to merge your own accounts, as well as message both types of contacts.

Messenger users will have to download and install Skype (ideally the latest version). Once there, all they have to do is log in with their Microsoft account, and their Messenger contacts will be available inside.

This is possible since Microsoft moved its Messenger users over to Microsoft accounts a while ago, just as it brought Skype under the same umbrella. As such, the two have technically been connected for some time.

The discontinuation of Messenger came as a slight shock back in November. Yet Skype simply has a larger membership: around 280 million monthly active users, up roughly 100 million since it was purchased by Microsoft.

Many use both Messenger and Skype, but Microsoft wants everyone to just use the latter. It doesn’t make sense for the company to maintain and update two communication tools for consumers.

This is especially true if you remember that the whole tech world, including VoIP, is heading to mobile. Skype has made a big effort to build out various mobile apps, easily leaving Messenger in the dust.

The email contains instructions for downloading and installing Skype. It also features this brief FAQ:

So, what’s happening between now and 15th March? Messenger will continue to work as you know it today. If you are signed in with Messenger on your desktop** you will see a banner notification to upgrade. When you click on the banner, an installer window will open with the request to upgrade. This will take you through our installer flow to install Skype and automatically uninstall Messenger.

So, what’s happening after 15th March? Messenger users on desktops** will not be able to sign in and will only be able to upgrade to Skype. If you attempt to sign in, a notification will appear, and if you continue, you will be taken through our installer flow to install Skype and automatically uninstall Messenger at the same time.

Can I update to Skype on my mobile? Yes! Skype is available on iPhone, Android and soon on Windows Phone 8. We encourage you to download the latest Skype app on your mobile and then uninstall Messenger. You will be able to sign in to Skype on your iPhone, Android and Windows Phone 8 mobile apps with your Microsoft Account over the next few weeks. If you use another phone with Messenger on it, it will continue to work for a while.

* To get group video chat, you’ll need a Skype Premium subscription.

** Newer versions of Messenger will be able to receive the optional upgrade notifications. Older versions will not receive the notifications and you will have to download Skype manually.

We’ll keep you posted as we get closer to the big day.

Image credit: Nevit Dilmen.

Why Lisp?


Comments:"Why Lisp?"

URL:http://lisperator.net/blog/why-lisp/


Aug 11, 2012

As this new website is written in Lisp, I thought a first post to praise Lisp would be in order. If you're a Lisper, you know all this already ;-) and if you're not, I hope it inspires you.

In early 2010 I joined the IT team of a small research center in Italy. Our manager at the time was a smart guy, a programmer himself—Emacs user and Haskell lover. We started a big project and, although the initial plan was to develop it in PHP (there were already a few PHP devs there), the manager agreed that I could use whatever language would bring us to the objective fastest. He'd have preferred Haskell, but I don't know it. I had some 10 years of experience with JavaScript, though not on the server side (NodeJS was in its early stages in those days), 8 years with Perl, and only a few months with Common Lisp. I picked Lisp, and I still congratulate myself for that choice.

No dependency hell

Earlier this year I swapped my old server for a better one. I could have simply copied the full HDD contents from the old server to the new one, but my old server was a messy combination of Debian stable + testing + unstable, and I decided the only way to clean up the mess was to start all over. With a freshly configured server, I had to reinstall all my Perl stuff, such as my old website. That was somewhat painful, and I realized just how easy it is to deploy Common Lisp applications compared to Perl (or anything else, probably).

With Lisp, you just build your application and get one binary; you copy that to your server and you're done. With Perl, you have to install a load of modules on the server: some of them are available in Debian via apt-get, others are only on CPAN, some require gcc/g++ and random C/C++ libraries you didn't even know you were using, and some simply fail to build anymore.

After 8 years of Perl, I can promise you that setting up a (web) development environment with Common Lisp is much, much easier.

Not unreadable

In the years I've been using Common Lisp, I've noticed that I can look through code I haven't touched in two years and immediately understand what it does.

More importantly, I can read code written by others and understand what they meant relatively easily. This is helped by a great feature of the environment (Emacs/SLIME): cross-referencing tools. I can place the cursor on the name of a function and ask “where is this defined?”, and it jumps to the place of the definition, which could be in my own code or in third-party code. I can also ask “where is this function called?” or “where is this variable referenced?”. This is extremely useful for code refactoring, and it beats every “modern” IDE I've seen.

Organic growth

In most other languages, the development cycle is, roughly: (1) edit, (2) compile (depending on the language), (3) run the program to test it. I can still remember the days when I had to run /etc/init.d/apache restart for Apache to pick up the changes I'd made to my Perl modules.

In Lisp things are different. You're not just writing your program; you're growing it. Lisp is alive. Your program starts from nothing—you really just start the REPL—and then you keep adding to it and changing it as it runs. You talk to it all the time; you can inspect and modify data and functions as the program runs; even when problems arise, such as a hard error, you can drop into a debugger and inspect local variables at various stack frames, or even evaluate arbitrary expressions in a certain stack frame.

Many of these ideas, which Lisp pioneered, were later implemented in other languages. For example, you can attach the GDB debugger to a running process and inspect live data; however, you cannot change the running process — you can't redefine functions, for example.

The only way to understand how great this is is to try it; you won't get it from this short blog post, nor from a thousand-page book.

Stable as a rock

When I was younger, I had the feeling that if a software library hadn't been updated in 5 years, it was unmaintained and probably broken. That, sadly, is usually the case with many programming languages, because they're moving targets. They change frequently and they depend on lots of external factors: operating system, compilers, shared libraries, etc.

That doesn't seem to apply to Common Lisp. The language is extremely stable. You often find code written a decade ago that still functions properly. You might think that this happens because Common Lisp did not “evolve”, and that's one face of the truth.

Programmable programming language

The fact is, Common Lisp is powerful enough that you can modify the language from itself, without the need for a new spec (and by that I don't simply mean that you can define functions and variables). I don't know of a single programming paradigm that wasn't brought to CL; in fact most paradigms, and even whole programming languages, were prototyped in Lisp long before they had a life on their own.

Support for macros, of course, is most of what makes this possible. But the fun part is that even if Lisp didn't have macros, you could implement them in Lisp itself with a bit of effort. That can't be said of other languages, because it's only the Lisp syntax that makes macros possible. There have been attempts to add macros to other languages (most notably C's #define), but they're not even close to Lisp macros. But of course, Lispers know this already, and non-Lispers still wonder what macros are good for.

Going further than “ordinary” macros, you can define reader macros. Experienced Lispers avoid them unless they're really necessary. The extent to which you can change the language is only limited by your imagination, and that's probably why the standard was not updated in almost twenty years; it's very good as it is; like the United States Constitution, it gives power to the people. I guess nobody wants to change that (well, not the people, anyway).

Lisp is simple!

The Lisp beginner might read this in disbelief. I know that before using Lisp, I myself thought it was a weird language that must be extremely complicated. Nothing is further from the truth, but the truth is hard to see when your brain is wired to think in terms of for and while statements, in blocks delimited by { curly brackets }, and especially when you expect the language to read your mind about operator precedence.

A Lisp interpreter can be written in just a few lines of code, as this great paper by Paul Graham shows. My first Lisp interpreter was written in Perl, based only on the ideas in that document and a quick-and-dirty parser.
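In the same spirit, and strictly as a sketch (no parser, no macros, invented names), here is the core of such an evaluator written over pre-parsed lists:

import operator as op

ENV = {"+": op.add, "-": op.sub, "*": op.mul,
       "car": lambda x: x[0], "cdr": lambda x: x[1:],
       "cons": lambda a, d: [a] + d}

def evaluate(x, env=ENV):
    """Toy Lisp evaluator, e.g. evaluate(["+", 1, ["*", 2, 3]]) => 7."""
    if isinstance(x, str):        # symbol lookup
        return env[x]
    if not isinstance(x, list):   # self-evaluating atom
        return x
    head = x[0]
    if head == "quote":
        return x[1]
    if head == "if":
        _, test, then, alt = x
        return evaluate(then if evaluate(test, env) else alt, env)
    if head == "lambda":
        _, formals, body = x
        return lambda *args: evaluate(body, {**env, **dict(zip(formals, args))})
    fn = evaluate(head, env)      # ordinary application
    return fn(*[evaluate(arg, env) for arg in x[1:]])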

Like mathematics, Lisp is based on just a few axioms. Common Lisp did get big and complex—like mathematics—but it's written in a simpler Lisp, which is in turn written in an even simpler Lisp. The analogy goes on: complex mathematics is defined in terms of simpler mathematics, which is itself defined in terms of even simpler maths. Eventually, it all comes down to the empty set. It's based on absolute, incontestable truths.

If you bootstrap a very simple Lisp compiler, you can then define and redefine the language in terms of itself. I did this and it was a fabulous experience. If you want to learn Lisp but you fear it's complex, my suggestion is to start looking at it from the bottom, from the simple things. Just as you learn maths.

So this new website is powered by Common Lisp. I expect the code I wrote these days to function for decades, though that of course depends on other factors (there's a lot of CSS, HTML and JavaScript in it).

This is actually the first website I've done in Common Lisp. I make a distinction between websites and Web applications; in many ways, a website is harder to make. I've been working on a CL tool, Sytes, to make this easier. Using it together with some other neat CL tools like Quicklisp, buildapp and sb-daemon, I think it's now as easy to get started writing websites in CL as it is with PHP. I hope to prove that soon.

In the meantime, if you want to get in touch you can email me or leave a comment on these pages. Comments are moderated because I'm fed up with comment SPAM; this means I'll have to approve your comment before it appears on the website, but if you enter your email address you'll receive an email and will be able to validate your comment yourself. Your email address is not published.

