Norman Walsh (Sun): DocBook and HTML 5(.x)


HTML 5(.x) today reminds me a lot of DocBook 1(.x) twenty years ago. That's neither criticism nor compliment, merely observation.

Some random wind blew the HTML 5(.1) “picture” element across my desk today. That led me to a page somewhere that enumerated all of the proposals for HTML 5.x elements in their various stages of standardization.

That's drifted back through my consciousness several times today until finally I realized why. The reason is: it reminds me a whole lot of DocBook twenty years ago. Hear me out.

Twenty years ago, DocBook had a relatively small number of tags. Like HTML of today, it had enough markup to do articles and sections and paragraphs and images and block quotations and a short list of other things.

Twenty years ago, DocBook had a selection of specialized elements in addition to the basic structural elements necessary to capture expository prose. HTML has them too; the specializations are different, but that's not surprising.

DocBook was about interchange so there was a fairly diligent effort undertaken to make sure that the processing expectations of each element were clearly defined. The variety of outputs imagined and the fact that the DocBook community had nearly no appreciable influence over the development of the platforms that would support those outputs meant that there was a certain vagueness, but we have always cared. HTML, the specification, goes to great lengths to describe the processing expectations of…everything, not just proper, valid markup but essentially every sequence of characters. HTML is as much about interoperability of browsers as anything else and so there's tremendous effort undertaken to ensure that interoperability.
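That spec-level thoroughness is easy to see in practice: an HTML parser is expected to recover from any input rather than reject it. Here is a small illustration using Python's standard-library html.parser (a simplified parser, not a full HTML5 implementation, but lenient in the same spirit):

```python
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    """Record every start tag the parser reports."""
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

# Deliberately broken markup: an unknown element, unclosed tags,
# and a stray "<" that is not part of any tag.
collector = TagCollector()
collector.feed("<p>one <b>two <madeup>three < four</p>")
print(collector.tags)
```

No exception is raised; the parser treats the stray "<" as character data, reports the made-up element like any other, and carries on. Defining behavior for every byte sequence is what makes that recovery interoperable across browsers.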

DocBook had a relatively large and diverse community of users (some significant fraction of techpubs plus a smattering of other fields of publication). Ok, HTML's relatively large and diverse community (basically everyone everywhere) eclipses the DocBook community the way the population of beetles on the earth dwarfs, say, the human population of Rhode Island, but we're talking relative scales.

An interesting thing about a large and diverse community of users is that they have different interests and different requirements. And if the community is big enough, you wind up with tags that are of interest to “a large group of people” who are still a relatively small group compared to the whole. DocBook certainly has markup that “almost everyone” never uses, and that I sometimes wish we hadn't invented, because various groups of users, perceived at the time to be of significant size, were able to make a compelling case for it.

HTML, like DocBook, has a committee of developers who respond to requests for new tags, proposals for new tags, proposals for changes to tags, proposals for extensions to tags, proposals for the removal of tags, etc. And like any committee, it attempts to establish guidelines and policies and undertakes to serve its community as best it can.

DocBook today has a quite large set of tags. Large enough that lots of folks want subsets. I don't know if HTML has become that large yet, but I bet it will.

HTML's evolution is never going to more than superficially resemble DocBook's evolution. The HTML community has direct and compelling influence on the platform that supports it (or maybe it's the other way around). DocBook still focuses on encoding technical documents; most of the HTML effort seems to be about developing an open, portable application development framework. Nothing wrong with that except to the extent that it seems to marginalize other goals for the web which, no doubt, one could argue it doesn't.

There's nothing profound in these observations, but I look forward to seeing what HTML is like in twenty years. And DocBook too, of course. I wonder if HTML will have twenty year old legacy markup that almost no one uses or if they'll be able to keep things tidier. The fact that HTML is being developed for effectively a single, global platform (or a platform that appears to be that way from most angles) means there's more opportunity for deprecation, I suppose.


Norman Walsh (Sun): The short-form week of 14–20 Jul 2014


The week in review, 140 characters at a time. This week, 1 message in 8 conversations. (With 5 favorites.)

This document was created automatically from my archive of my Twitter stream. Due to limitations in the Twitter API and occasional glitches in my archiving system, it may not be 100% complete.

Monday at 09:56am

@ndw Perhaps time to update your bio here?: —@ronhitchens

Tuesday at 12:46pm

New Weird Al video is fantastic: Word Crimes—@codinghorror

Saturday at 08:46pm

RT @geoffarnold: This *should* be the Democratic manifesto. Not holding my breath... —@ndw

Sunday at 07:13am

XML Stars, the journal is out! Stories via @ndw —@dominixml

Sunday at 01:46pm

No, not a Constitutional scholar, just…watch this closely. Church…here. State…there. Separate. Would a picture help?—@StephenKing

Sunday at 02:47pm

Real #standards aren't cool until someone declares you dead, or copies you into JSON and claims they invented it. @jonlehtinen @gkirkpatrick —@JamieXML

Sunday at 03:45pm

No, You Don’t Have a Right to be Forgotten—@dtunkelang

Sunday at 08:28pm

How often do I make chemistry jokes?... Periodically.—@SciencePorn

Norman Walsh (Sun): The short-form week of 7–13 Jul 2014


The week in review, 140 characters at a time. This week, 11 messages in 18 conversations. (With 8 favorites.)

This document was created automatically from my archive of my Twitter stream. Due to limitations in the Twitter API and occasional glitches in my archiving system, it may not be 100% complete.

In a conversation that started on Monday at 03:46pm

Yes,,,, and a few other sites are down. Should be back tomorrow.—@ndw
@ndw is down (used in your flickr1.xsl library). Is this related to the other sites?—@erikespana
@ndw I noticed it today. Do you have an estimate of when that file will be back online?—@erikespana
@ndw What about hosting the file on your github account, until your websites are restored?—@erikespana
@erikespana I'll see what I can do today. Not sure what the story is...—@ndw
@erikespana I didn't think that was down. Other sites still are :-(—@ndw

In a conversation that started on Sunday at 06:01pm

Note to self: eat more oysters.—@ndw
@ndw Whilst avoiding the bad ones?—@dpawson

Monday at 10:50am

RT @BadAstronomer: Dear media: Stop putting anti-science reality-denying crackpots on your news shows. —@ndw

Tuesday at 07:14am

XML Stars, the journal is out! Stories via @JamieXML @ndw —@dominixml

Tuesday at 06:23pm

Only an engineer would do this —@SciencePorn

Tuesday at 06:51pm

Looking down on the crater from Haleakalā. —@dtunkelang

Wednesday at 07:48am

The optimist says the glass is half full. The pessimist says half empty. The scientist says, "why isn't it full of coffee?"—@SciencePorn

In a conversation that started on Wednesday at 08:16am

@peteaven New phone?—@ndw
@ndw yessir. 5s. having fun with slo-mo. all vids are much more dramatic now. oh, and it's a phone too it seems!—@peteaven

Wednesday at 08:19am

RT @geoffarnold: Money quote: "This is what happens in a surveillance state: to inoculate themselves against suspicion, people... http://t.… —@ndw

Wednesday at 04:25pm

Got a Sonos:1 for a small shelf. Worried that power cord might be a problem; discovered cord elegantly recessed into the base. #designwin —@ndw

Wednesday at 08:53pm

You're a ghost driving a meat coated skeleton made from stardust, what do you have to be scared of? ~@porkbeard —@pickover

In a conversation that started on Wednesday at 09:14pm

All is right with the world. Or at least my little corner of it. Weblog and ancillary sites back online. Sorry about that folks.—@ndw
@ndw is not working for me, is that in your corner?—@apb1704
@apb1704 Don't think that was affected and seems up for me. Still having trouble?—@ndw
@ndw Up now, maybe was local problem for me. thanks.—@apb1704
@ndw is still down for me. Any idea when that'll be online again?—@erikespana

Thursday at 07:14am

XML Stars, the journal is out! Stories via @RealMichaelKay @ndw @JamieXML —@dominixml

Thursday at 11:56am

“We have to do the HTML5 blockquote with seven nested tags inside it because, semantics.”—@zeldman

Thursday at 11:47pm

American out of office: "I'm on vacation but will check email hourly. Reach me on my mobile." European: "I am unavailable until September."—@shanselman

Sunday at 10:50am

Objective: custard. Achievement: very sweet, slightly scrambled eggs. #dangit —@ndw

ProgrammableWeb APIs: Glitxt

Glitxt is designed to let users encode text in an image in a way that conceals the text while making it obvious to the human eye that something is hidden in the image. The encoding makes it difficult for automated data miners to extract the hidden text, but it is not an advanced encryption method. The Glitxt API allows users to encode text into an image, decode an image to get the text back, or send a ping request to check whether the API is up.
Date Updated: 2014-07-23
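The underlying idea is simple enough to sketch. What follows is a hypothetical illustration of Glitxt-style encoding in plain Python, not the actual Glitxt API (whose endpoints and formats are not reproduced here): text bytes overwrite pixel values, so the glitch is visible to a human but the text is no longer plain text to a scraper.

```python
def encode(pixels, text):
    """Overwrite the first len(text) pixel values with character codes.
    The change is obvious to the eye but opaque to a text scraper."""
    data = [ord(c) for c in text]
    return data + pixels[len(data):], len(data)

def decode(pixels, length):
    """Recover the hidden text from the glitched region."""
    return "".join(chr(v) for v in pixels[:length])

image = [128] * 64                     # a flat gray 8x8 "image"
glitched, n = encode(image, "hello")
assert decode(glitched, n) == "hello"  # round trip works
assert glitched[n:] == image[n:]       # rest of the image untouched
```

The real service works on actual image files, but the trade-off is the same: trivially reversible if you know the scheme, invisible to a miner that only looks for text.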

ProgrammableWeb APIs: Pemilu Election Results

The Pemilu Election Results API allows users to retrieve information on Indonesia's 2014 election results. These results can be sorted by party, province, electoral district, legislative body, and election year. The Election Results API is part of API Pemilu, a collection of APIs that provide Indonesia's approximately 187 million voters with important election information.
Date Updated: 2014-07-23

Jeremy Keith (Adactio): Adactibots

I post a few links on this site every day—around 4 or 5, on average. If you subscribe to the RSS feed, then you’ll know about them (I also push them to Delicious but I don’t recommend relying on that).

If you don’t use RSS—you lawn-off-getting youngster, you—then you’d pretty much have to actually visit my website to see what I’m linking to. How quaint!

Here, let me throw you a bone in the shape of a Twitter bot. You can now follow @adactioLinks.

I made a little If This, Then That recipe which will ping the RSS feed and update the Twitter account whenever there’s a new link.
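A sketch of what such a recipe does under the hood: poll the feed, pick out anything not yet seen, and hand it off for posting. The feed content and names below are made up for illustration; the real recipe posts to Twitter rather than collecting tuples.

```python
import xml.etree.ElementTree as ET

# A made-up, minimal RSS payload standing in for the real feed.
RSS = """<rss><channel>
  <item><title>A link</title><link>https://example.com/a</link></item>
  <item><title>Another</title><link>https://example.com/b</link></item>
</channel></rss>"""

def new_items(feed_xml, seen):
    """Yield (title, link) pairs not yet seen -- the core of a
    poll-the-feed, post-somewhere job like the IFTTT recipe."""
    for item in ET.fromstring(feed_xml).iter("item"):
        link = item.findtext("link")
        if link not in seen:
            seen.add(link)
            yield item.findtext("title"), link

seen = set()
posts = list(new_items(RSS, seen))   # first poll: both items are new
again = list(new_items(RSS, seen))   # second poll: nothing new
```

The `seen` set is the whole trick: IFTTT keeps equivalent state server-side so each new feed entry triggers exactly one tweet.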

I’ve done the same thing for my journal (or “blog”, short for “weblog”, if you will). You can either subscribe to the journal’s RSS feed or decide that that’s far too much hassle, and just follow @adactioJournal on Twitter instead.

The journal postings are far less frequent than the links. But I still figured I’d provide a separate, automated Twitter account because I do not want to be that guy saying “In case you missed it earlier…” from my human account …although technically, even my non-bot account is auto-generated: my status updates start life as notes on—Twitter just gets a copy.

There’s also @adactioArticles for longer-form articles and talk transcripts but that’s very, very infrequent—just a few posts a year.

So these Twitter accounts correspond to different kinds of posts, in decreasing order of frequency:

Amazon Web Services: Big Data Update - New Blog and New Web-Based Training

The topic of big data comes up almost every time I meet with current or potential AWS customers. They want to store, process, and extract meaning from data sets that seemingly grow in size with every meeting.

In order to help our customers to understand the full spectrum of AWS resources that are available for use on their big data problems, we are introducing two new resources -- a new AWS Big Data Blog and web-based training on Big Data Technology Fundamentals.

AWS Big Data Blog
The AWS Big Data Blog is a way for data scientists and developers to learn big data best practices, discover which managed AWS Big Data services are the best fit for their use case, and help them get started on AWS big data services. Our goal is to make this the hub for developers to discover new ways to collect, store, clean, process, and visualize data at any scale.

Readers will find short tutorials with code samples, case studies that demonstrate the unique benefits of doing big data on AWS, new feature announcements, partner- and customer-generated demos and tutorials, and tips and best practices for using AWS big data services.

The first two posts on the blog show you how to Build a Recommender with Apache Mahout on Amazon Elastic MapReduce and how to Power Gaming Applications with Amazon DynamoDB.

Big Data Training
If you are looking for a structured way to learn more about the tools, techniques, and options available to you as you learn more about big data, our new web-based Big Data Technology Fundamentals course should be of interest to you.

You should plan to spend about three hours going through this course. You will first learn how to identify common tools and technologies that can be used to create big data solutions. Then you will gain an understanding of the MapReduce framework, including the map, shuffle and sort, and reduce components. Finally, you will learn how to use the Pig and Hive programming frameworks to analyze and query large amounts of data.
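The three stages named above are easy to sketch. Here is a toy word count in plain Python, illustrative only and not the course material or Amazon EMR code, showing what each stage contributes:

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle_sort(pairs):
    """Shuffle and sort: group all emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return sorted(groups.items())

def reduce_phase(groups):
    """Reduce: collapse each key's values into a single result."""
    return {key: sum(values) for key, values in groups}

docs = ["big data on aws", "big data big compute"]
counts = reduce_phase(shuffle_sort(map_phase(docs)))
```

In a real MapReduce framework, the map and reduce functions run in parallel across many machines and the shuffle-and-sort step moves data between them; the logic per record is exactly this small.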

You will need a working knowledge of programming in Java, C#, or a similar language in order to fully benefit from this training course.

The web-based course is offered at no charge, and can be used on its own or to prepare for our instructor-led Big Data on AWS course.

-- Jeff;

Jeremy Keith (Adactio): Indie Web Camp Brighton

If you’re coming to this year’s dConstruct here in Brighton on September 5th—and you really, really should—then consider sticking around for the weekend.

Not only will there be the fantastic annual Maker Faire on Saturday, September 6th, but there’s also an Indie Web Camp happening at 68 Middle Street on the Saturday and Sunday.

We had an Indie Web Camp right after last year’s dConstruct and it was really good fun …and very productive to boot. The format works really well: one day of discussions and brainstorming, and one day of hacking, designing, and building.

So if you find yourself agreeing with the design principles of the Indie Web, be sure to come along. Add yourself to the list of attendees.

If you’re coming from outside Brighton for the dConstruct/Indie Web weekend, take a look at the dConstruct page on AirBnB for some accommodation ideas at very reasonable rates.

Speaking of reasonable rates… just between you and me, I’ve created a discount code for any Indie Web Campers who are coming to dConstruct. Use the discount code “indieweb” to shave £25 off the ticket price (bringing it down to £125 + VAT). And trust me, you do not want to miss this year’s dConstruct.

It’s just a little over six weeks until the best weekend in Brighton. I hope I’ll see you here then.

Shelley Powers (Burningbird): Responding to Charity Navigator's DA on the Humane Society of the United States

circus elephants on parade

Courtesy of the Boston Public Library, Leslie Jones Collection.

I was sent a link to a story and asked if it was true. The story noted that Charity Navigator, the charity watchdog group, had attached a Donor Advisory to the Humane Society of the United States' listing, specifically because of the lawsuits related to the Ringling Brothers circus.

I was astonished. A donor advisory because of a single Endangered Species Act lawsuit? Many nonprofits are involved in lawsuits as they work to achieve the goals that are part of their underlying mission. I have a hefty annual PACER (federal court document system) fee because of the documents I download for the numerous environmental and animal welfare cases I follow—and I'm only following a tiny fraction of the cases I'd really like to follow.

Was the Donor Advisory given because the animal welfare groups lost the case? I would hope not, because penalizing nonprofits for taking a chance in court would have a chilling effect on their ability to do their work.

Was the Advisory given, then, because they also entered into a settlement for attorney fees? That seems to be more likely, especially considering the hefty size of the attorney fee settlement ($15 million). However, that a single incident related to a single court case would override 60 years of history in the Charity Navigator's decision seemed both capricious and arbitrary. If civil lawsuits were not part of the arsenal of the organization, or if HSUS was in the habit of losing these cases and having to pay hefty attorney fees on a regular basis, then I think it would give most people pause before donating—but a single instance? Frankly, my first reaction was, "Well, aren't you the precious."

Charity Navigator also referenced the fact that Ringling Brothers filed a counter-lawsuit against the animal welfare organizations based on RICO—the Racketeering law. The reference to RICO does sound serious, if it weren't for the fact that because of the RICO law's overly loose design, and due to the Supreme Court's over-reliance on the "intent" of Congress when passing the law, RICO's purpose has been badly muddied over the years. Now, rather than go after the Mafia or sophisticated white-collar criminal networks, RICO has become a highly tempting tool in corporate America's tool belt, especially after the recent findings in the Chevron RICO lawsuit related to the earlier lawsuit brought by poor Ecuadorians against the oil company for environmental damage to their lands.

Regardless, neither lawsuit—the original Endangered Species Act lawsuit brought by the animal welfare groups (not including HSUS), or the RICO case—ever reached a decision on the merits. The former was dismissed because of lack of standing, and the second never went to trial. As part of the attorney fee settlement, Feld Entertainment (parent company for the circus) agreed to dismiss the RICO lawsuit. The fact that the corporation filed a complaint should be seen as irrelevant and not figure into any agency's determination of whether the organizations involved are sound or not. Not unless Charity Navigator believes that all one has to do is file a complaint in court and it's automatically taken as true.

Charity Navigator noted the reasons why the Judge dismissed the ESA case for lack of standing, though the agency's understanding of the legal documents and associated time line of all the events are equally confused and inaccurate. For one, the agency stated that Feld filed the RICO lawsuit after the ESA case was decided. Feld originally filed the RICO lawsuit in 2007 when Judge Sullivan denied the company's request to amend its answer and assert a RICO counter-claim. The new lawsuit was stayed until the ESA case was decided in 2009, and Feld amended its original complaint in 2010, when the RICO case started up again.

I wanted to pull out part of the memorandum Judge Sullivan wrote in 2007 when he rejected Feld Entertainment's request to amend their answer (leading to the RICO lawsuit). It relates to Feld's implication that the animal welfare groups were involved in a complex and corrupt scheme to pay their co-plaintiff, Tom Rider, which the company lawyers claimed they didn't know about until 2006.

Finally, the Court cannot ignore the fact that defendant has been aware that plaintiff Tom Rider has been receiving payments from the plaintiff organizations for more than two years. Although defendant alleges an “elaborate cover-up” that prevented it from becoming “fully aware of the extent, mechanics, and purpose of the payment scheme until at least June 30, 2006,” Def.’s Mot. to Amend at 4, such a statement ignores the evidence in this case that was available to defendant before June 30, 2006 and does not excuse defendant’s delay from June 30 forward. Plaintiffs’ counsel admitted in open court on September 16, 2005 that the plaintiff organizations provided grants to Tom Rider to “speak out about what really happened” when he worked at the circus.

In other words, Feld's lawyers found out about the "elaborate scheme" to fund Tom Rider, because the animal welfare groups mentioned funding Tom Rider during a court hearing in 2005.

As for that funding, it is true that the animal welfare groups paid Tom Rider about $190,000 over close to ten years. However, what isn't noted is that some of that "money" wasn't money at all. Rider was given a computer, a cell phone to keep in contact with the groups, a used van so he could travel around the country speaking out about the trial and his experiences with the circus, and various other goods. The groups also provided IRS forms for years 2000 through 2006 for Rider. When I added up the income for these years, it came to $152,176.00. However, after all of Tom Rider's expenses were deducted, over the seven years he "took home" a total of $12,582, for an average of $149.78 a month. That's to pay for all of his personal expenses—including a cheap dark blue polyester suit and equally cheap white shirt and tie he wore to the trial. (Tom Rider must have stood out for the plainness of his garb when next to Feld Entertainment's $825.00 an hour DC lawyers during the trial.)

Among the small selection of oddly one-sided court documents that Charity Navigator linked, another was Judge Sullivan's decision denying the animal welfare groups' motion to dismiss the RICO case. What stands out in this document is a reference to the original Judge Sullivan decision, specifically a comment about the Rider funding:

The Court further found that the ESA plaintiffs had been “less than forthcoming about the extent of the payments to Mr. Rider.”

I compare this statement with Sullivan's statement I quoted earlier, wherein Sullivan denied Feld's request to amend its complaint because of the supposed underhanded and secret funding—an assertion that Sullivan rejected in 2007. The newer contradictory 2009 statement was just one of the many inconsistencies in Judge Sullivan's decisions over the years related to these two cases.

But the last issue that Charity Navigator seemed to fixate on was Feld's attempt to get confidential donor lists from the animal welfare groups. I've written about this request, and my great disappointment in Judge Facciola's decision to grant the request.

Nothing will ever convince me this wasn't a bad decision, with the potential to set an extremely bad precedent. Even when the discovery was limited primarily to those people who attended a single event, it's appalling that confidential donor lists can be given to a corporation that represents everything the donors loathe and disdain—and a corporation with a particularly bad record when it comes to dealing with animal welfare groups and other people—not to mention its abysmal record when it comes to its animal acts.

The animal welfare groups settled because when you have a billionaire throwing $825.00 an hour lawyers at a case, and said billionaire doesn't care how much it costs to win, it didn't make sense to continue fighting a fight that was already stacked against them. When Judge Sullivan ruled on the ESA case, he should have recused himself in the RICO case, because to rule favorably for the animal welfare groups in the RICO case would be to say he was inherently mistaken in many of his assertions in the ESA case. When he turned the case over to the Magistrate Judge, Judge Facciola should have exercised independent thinking rather than just continue to parrot Judge Sullivan. And the groups would continue to spend way too much money fighting a lawsuit that the other side would deliberately stretch out as long as it possibly could.

Top all that with the threat to the anonymity of their donors, and the groups settled. Point of fact, if they settled specifically to protect their donors, more power to them. They should be commended for doing so, not punished.

What's ironic is that, in my original posts on the donor list request, I noted that if the animal welfare groups had to give these lists out, it would most likely impact their ratings on sites such as Charity Navigator. Never in my wildest dreams did I expect that Charity Navigator would give a donor advisory to the groups just because a judge ordered that the lists be provided, not because they actually were provided. The groups had planned on appealing this ruling before they settled, and frankly, I think they had a good chance of winning the appeal. But the very fact that a no-longer-existing possibility of an event is enough to trigger a donor advisory leaves me wondering how many more innocent nonprofits will be labeled with a donor advisory just because someone sent in a newspaper article about the possibility of an event?

Kenneth Feld's $825.00 an hour lead attorney, John Simpson, was recently interviewed for a legal publication. In it, he spoke about the donor lists:

“They didn't want a situation where I’m taking the deposition of some donor asking — if you knew they were going to take this money to pay a witness, would you have given this donation?” Simpson said. “I don’t think they wanted that kind of discovery to take place. Some people might have made the donation anyway. But most of these people would have said — no, I wouldn't have done that. And you would have been in the middle of their donor relations and potentially cutting off their donations in the future.”

In actuality, the one fundraiser that was at issue in the donor list request did specifically state that the money was for the lawsuit, and other requests for funds specifically stated the money was for Tom Rider's media campaign. I, for one, was concerned about what would happen to individuals put into an intimidating situation by a high-priced DC powerhouse attorney. Mr. Simpson has a way of asking questions in depositions and then paraphrasing the responses so that even the most innocent and naive utterance seems dark and dastardly. It was unfortunate that Judge Sullivan allowed his scarcely concealed disdain for Tom Rider to lead him to basically accept whatever Simpson said, even though the animal welfare groups presented solid arguments in defense.

Lastly, Charity Navigator linked an article in the Washington Examiner, as if this was further evidence of good reasoning for the donor advisory. Might as well link Fox News as a character reference for the EPA, or The Daily Caller as a reasoned source of news for President Obama.

Just because something shows up in a publication online does not make what's stated truth, or even reliable opinion. That a charity watchdog would link a publication known for its political and social bias as some form of justification for a decision only undermines its own credibility. Yes, the HSUS and the FFA are involved in lawsuits with a couple of insurance companies regarding their liability coverage. As noted, though, it's common for insurance companies to deny claims of liability when it comes to litigation fees. Kenneth Feld himself is involved in a lawsuit with his insurance company over its refusal to pay those $825.00 an hour fees for Feld's attorneys in the lawsuit with his sister.

However, there were several insurance companies involved with the groups and this court case. One way or another most, if not all, of the attorney fee settlement will be paid by one or more insurance companies.

An interesting side note about the insurance company lawsuits is that the Humane Society's lawsuit is being handled in federal court, while the Fund For Animals lawsuit is being managed in the Maryland state court system. This disproves one Feld Entertainment claim: that HSUS and FFA are one organization (the claim used to justify dragging HSUS into the lawsuit). The reason for the split is that FFA is a Maryland corporation, while HSUS is not, and the insurance company was able to argue that it could move the HSUS case to the federal level because of jurisdictional diversity. Nothing more succinctly demonstrates that FFA and HSUS are not the same corporate organization. Yet HSUS has received a donor advisory for a lawsuit it was never involved in: FFA was involved in the ESA suit, but not HSUS.

There is so much to this case, too much to cover in a single writing, but I did want to touch on the major points given by Charity Navigator in its donor advisory. Will the advisory hurt an organization like HSUS? Unlikely. The Humane Society of the United States is one of the older, more established, and largest animal welfare organizations in the country. Its charity ratings to this point have been excellent. A reputable organization like the BBB lists it as an accredited charity, and one only has to do a quick search online to see that it is currently involved in many different animal welfare efforts across the country—from rescuing animals in North Carolina to defending American burros. Whether or not people donate to the organization, it won't be because of Charity Navigator's listing, because most people don't need Charity Navigator to learn more about the HSUS.

But such donor advisories could negatively impact lesser-known, smaller charities. I hope that when Charity Navigator issues such a drastic warning from this day on, it does so based on a foundation that is a little less arbitrary, and much less capricious, than the one it used for HSUS and the other animal welfare groups involved in this court case.

Amazon Web Services: AWS Week in Review - July 14, 2014

Let's take a quick look at what happened in AWS-land last week:

Monday, July 14
Tuesday July 15
Wednesday, July 16
Thursday, July 17
Friday, July 18

Stay tuned for next week! In the meantime, follow me on Twitter and subscribe to the RSS feed.

-- Jeff;


ProgrammableWeb APIs: Passwd

If developers have been looking for a different yet strong password-generation API, Passwd could be worth a look. The API site classifies password security into three categories: secure, super secure, and insanely secure. Nowadays it is not enough to reuse the same access code across multiple sites; a good combination of letters and numbers, from 8 to 62 characters, can help prevent unwelcome access by attackers. The API offers JSON and JSONP with the parameters name, type, and values, along with simple examples and immediate support for developers.
Date Updated: 2014-07-21
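The generation side of such a service is easy to approximate locally. A minimal sketch using Python's secrets module; the function name and the letters-and-digits alphabet are illustrative choices, not the Passwd API itself:

```python
import secrets
import string

# Letters and digits, per the 8-to-62-character recommendation above.
ALPHABET = string.ascii_letters + string.digits

def generate_password(length):
    """Generate a random letters-and-digits password, enforcing the
    8..62 character range the entry describes."""
    if not 8 <= length <= 62:
        raise ValueError("length must be between 8 and 62")
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

pw = generate_password(16)
```

Using `secrets` rather than `random` matters here: it draws from the OS's cryptographic source, which is the appropriate choice for credentials.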

ProgrammableWeb APIs: FoodCare

When consumers have an ailment, they usually go to the doctor, and doctors often recommend dietary changes in order to preserve health. FoodCare offers personalized nutrition from the comfort of a mobile phone. Unlike other apps, FoodCare takes a community-based approach, in which individuals, restaurants, schools, senior communities, and hospitals can all benefit from specialized information provided by registered dieticians and nutrition educators. This gives developers ample opportunity to create apps with the FoodCare API. To get started, developers need to apply for a key. Features covered in the API documentation include cuisine types, health conditions, dishes, and restaurants for more complete application development.
Date Updated: 2014-07-21

ProgrammableWeb APIs: Wahoo Fitness

In a time when exercise activity is connected to our mobile phones, Wahoo Fitness offers an API that lets developers create fitness software for iPhone devices aimed at monitoring physical health. Features of the API include multifunction capability, program navigation, economy, memory storage capacity, and ease of internet upload. One of the benefits of the Wahoo Fitness API is the wide variety of places a developer can put it to work: not only can it display physical activity from heart rate monitors, it also reads sensors in scales, bike computers, and bike trainers. Support is available for developers interested in reshaping the way fitness works, either independently or as an affiliated partner with Wahoo Fitness.
Date Updated: 2014-07-21

ProgrammableWeb: APIsInboxApp

Inbox is a tool that compiles all mail providers into a single, easy-to-use web service. It streamlines the email experience by giving added control over the collection and cataloguing process. The Inbox RESTful API allows one to access, modify, and filter incoming mail from popular providers like Gmail or Microsoft Exchange, so older protocols such as IMAP and MIME are not a problem. The Inbox API uses Threads and Tags to delineate and classify objects, and is enterprise compatible.
Date Updated: 2014-07-21

ProgrammableWeb: APIsSkynet

Skynet is an API for the Internet of Things. With a collaborative approach, Skynet helps developers make drones dance, turn on Hue light bulbs, and control electronic devices like the Belkin WeMo. This hybrid cloud API can also send and receive messages to and from devices to monitor sensor activity. A key attraction of Skynet is NodeBlu, a Chrome app already on the market, which lets developers experiment by dragging, dropping, and writing on their own terms. Features of the API include status, devices, messages, events, and data. The site provides REST examples, a demo, and social media channels for additional support.
Date Updated: 2014-07-21

ProgrammableWeb: APIsImgflip

Imgflip is an SFW image generator targeted at youth culture. On the site, teenagers can caption memes, make a GIF from video or from images, and even make a pie chart. Using the API, developers can reach a younger audience in their internet practices. Although the API is currently free, a paid version can be arranged when developers contact Imgflip directly. The Imgflip API uses REST and JSON interfaces. The documentation shows developers how to use the API to get memes and to caption images, with examples of success and failure responses. The Imgflip API can work for developers interested in youth culture, and it can generate revenue when agreed by both parties involved, Imgflip and the developer team.
Date Updated: 2014-07-21

ProgrammableWeb: is the API in beta that developers might find beneficial to create activities in a local community. This API aims to provide an intuitive approach for consumers who tend to choose events targeted to their preferences. From the administrator perspective, the API serves as an organizer to spread the word of an event, track attendees, manage the activity and receive feedback for improvement. Developers in public relations industries who manage big events can benefit from this API, since according to the site the interface keeps concerts, marathons and shows digitally organized from beginning to end. API offers the features of user, search, event, stream, avatar and notification. Even though the site publishes worldwide events, technical support is available where developers can submit tickets for a personalized mashup creation.
Date Updated: 2014-07-21

Bob DuCharme (Innodata Isogen)When did linking begin?

Pointing somewhere with a dereferenceable address, in the twelfth (or maybe fifth) century.

University of Bologna woodcut

As I have once before, I'm republishing an entry from an O'Reilly blog I had from 2003 to 2005 on topics related to linking. I've been reading up on early concepts of metadata lately—I particularly recommend Ann Blair's Too Much to Know: Managing Scholarly Information before the Modern Age—and have recently found another interesting reference to the "Regulae Iuris" book mentioned below. When I wrote this, I was more interested in hypertext issues, and if I was going to change anything to update this piece, I would change the word "traverse" to "dereference," but all the points are still meaningful.

Works about linking often claim that it's been around for thousands of years, and then they give examples that are no more than a few centuries old. I can only find one reference to something more than a thousand years old that qualifies as a link: Peter Stein's 1966 work "Regulae Iuris: from Juristic Rules to Legal Maxims" describes some late fifth-century lecture notes on a commentary by the legal scholar Ulpian. The notes mention that confirmation of a particular point can be found in the Regulae ("Rules") of the third-century Roman jurist (and student of Ulpian) Modestinus, "seventeen regulae from the end, in the regula beginning 'Dotis'...". The citation's explicit identification of the point in the cited work where the material could be found makes it the earliest link that I know of.

Other than Stein's tantalizing example, all of my research points to the 12th century as the beginning of linking. In a 1938 work on the medieval scholars of Bologna, Italy, who studied what remained of ancient Roman law, Hermann Kantorowicz wrote that in "the eleventh century...titles of law books are cited without indicating the passage, books of the Code are numbered, and the name of the law book is considered a sufficient reference." He uses this to build his argument that a particular work described in his essay is from the eleventh century and not the twelfth, as other scholars had argued. Apparently, it was common knowledge in Kantorowicz's field that twelfth century Bolognese scholars would reference a written law using the name of the law book, the rubric heading, and the first few words of the law itself. (Referencing of particular chapters and sections by their first few words was common at the time; the use of chapter, section, and page numbers didn't begin until the following century.)

Italian legal scholars trying to organize and make sense of the massive amounts of accumulated Roman law contributed a great deal to the mechanics of the cross-referencing that provide many of the earliest examples of linking. The medievalist husband and wife team Richard and Mary Rouse also found some in their research into evolving scholarship techniques in the great universities of England and France (that is, Oxford, Cambridge, and the Sorbonne) and they described Gilbert of Poitiers's innovative twelfth-century mechanism for addressing specific parts of his work on the psalms: he added a selection of Greek letters and other symbols down the side of each page to identify concepts such as the Penitential Psalms or the Passion and Resurrection. If you found the symbol for the Passion and Resurrection in the margin of Psalm 2 with a little 8 next to it (actually, a little "viii"—they weren't using Arabic numerals quite yet), it would tell you that the next discussion of this concept appeared in Psalm 8. Once you found the same symbol on one of the eighth psalm's pages, you might find a little "xii" with it to show that the next discussion of the same concept was in Psalm 12. This addressing system made it possible for someone preparing a sermon on the Passion and Resurrection to easily find the relevant material in the Psalms. (In fact, aids to sermon preparation were among the main forces in the development of new research tools, as clergymen were encouraged to go out and compete with the burgeoning heretic movements for the hearts and minds of the people.)

The use of information addressing systems really got rolling in the thirteenth-century English and French universities, as scholarly monks developed concordances, subject indexes, and page numbers for both Christian religious works and the classic ancient Greek works that they learned about from their contact with the Arabic world. In fact, this is where Arabic numerals start to appear in Europe; page numbering was one of the early drivers for their adoption.

Quoting of one work by another was certainly around long before the twelfth century, but if an author doesn't identify an address for his source, his reference can't be traversed, so it's not really a link. Before the twelfth century, religious works had a long tradition of quoting and discussing other works, but in many traditions (for example, Islam, Theravada Buddhism, and Vedic Hinduism) memorization of complete religious works was so common that telling someone where to look within a work was unnecessary. If one Muslim scholar said to another "In the words of the Prophet..." he didn't need to name the sura of the Qur'an that the quoted words came from; he could assume that his listener already knew. Describing such allusions as "links" adds heft to claims that linking is thousands of years old, but a link that doesn't provide an address for its destination can't be traversed, and a link that can't be traversed isn't much of a link. And, such claims diminish the tremendous achievements of the 12th-century scholars who developed new techniques to navigate the accumulating amounts of recorded information they were studying.

Please add any comments to this Google+ post.

ProgrammableWeb: APIsXPS2PDF Web

XPS2PDF is a web-based tool for converting documents from .XPS or .OXPS format into .PDF format. The XPS2PDF Web API allows users to convert documents directly or asynchronously, get a list of recent conversion jobs, get the status of a specific job, or retrieve the quota limits and current usage for a user's account.
Date Updated: 2014-07-20

Doug Schepers (Vectoreal)You’re drunk FCC, go home

I just chimed in to the FCC to request that they stop the merger of Comcast and Time-Warner Cable. I don’t know if my voice will make a difference, but I do know that saying nothing will definitely not make a difference.

Here was my statement to the FCC (flawed, I’m sure, but I hope the intent and sentiment is conveyed):

Allowing the merger of Comcast and Time-Warner Cable will dramatically decrease consumer benefits and choice.

Some mergers can be good, allowing struggling companies to reduce losses; in this case, neither Comcast nor Time-Warner Cable is in a situation that needs this merger for financial stability; both companies are currently thriving in the marketplace.

Innovation and an open market for goods and services are in the best interest of the American people. This was clearly shown when the Bell System was broken up on January 8, 1982, leading to the emergence of advanced competitive services, including cellular phone service, and lower prices. The FCC should take that as a model and reject this monopolistic merger of competitors, which would decrease innovation, price competition, and customer choice. Customer service is already notoriously poor at both companies, and decreasing customer choice is likely to make it harder for customers to receive adequate service.

Without competition, Internet providers have little incentive to provide either improved service or lower prices. The US is already widely regarded as having relatively expensive and slow Internet service compared to other industrial nations, and this merger threatens to make that worse.

In addition to the loss of benefits to the consumer, this merger threatens American jobs. When a merger occurs, service departments also merge, and workers lose their jobs. This is especially true when the mergers are in similar industries; some studies have shown an average of 19% job loss, far above the norm of 7.9% when the industries are unrelated. Comcast currently employs 136,000 people; Time-Warner Cable currently employs 51,600 people; if the average job loss takes place, that could mean approximately 35,644 jobs lost, or more conservatively 14,820 jobs, in a still-struggling employment market; many of these will be unskilled labor, which is even harder to resolve. While no laws in the US take into account the effect of job loss on mergers, this is still a factor that can be taken into account by the FCC; laws are only necessary when systemic problems arise in the behavior of key industry players and regulators, and allowing this merger could necessitate the creation of a law that would otherwise be avoided.

Please take the necessary steps to block this merger.
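For what it's worth, the job-loss estimates in the statement above are straightforward percentages of the two companies' combined headcount, using the 19% and 7.9% figures it cites. A quick sanity check of the arithmetic:

```java
public class MergerJobLoss {
    public static void main(String[] args) {
        int comcast = 136000;         // Comcast employees
        int twc = 51600;              // Time-Warner Cable employees
        int combined = comcast + twc; // 187,600 combined

        // 19% average job loss when the merging companies are in similar industries
        System.out.println(Math.round(combined * 0.19));  // 35644
        // 7.9% when the industries are unrelated
        System.out.println(Math.round(combined * 0.079)); // 14820
    }
}
```

Both results match the "approximately 35,644" and "more conservatively 14,820" figures in the statement.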

If you are a US citizen, you have until August 25th, 2014 to file a comment. The FCC seems to have gone out of its way to make this difficult, so here are some step-by-step instructions:

  1. Fill out the Free Press petition first just in case. Then, if you want to register your opposition independently…
  2. Go to the FCC  Electronic Comment Filing System page
  3. Enter “14-57” in the Proceeding Number field; you’ll get no immediate confirmation, but this is the code for the “Applications of Comcast Corporation and Time Warner Cable Inc. for Consent to Assign or Transfer Control of Licenses and Applications”. (Note: this is not arcane at all. That’s just an illusion.)
  4. Fill in all required personal information
  5. Ensure that the Type of Filing field is set to “Comment” (the default)
  6. Write a text document explaining why this is such a bad idea; crib mine if you like, or find a much better rationale, but be sure to be clear in your opposition (or support, if you’re a masochist).
  7. Upload your document using the Choose File button. (That’s right, you can’t just leave a comment in a text area, you have to write a separate document. The FCC seems to accept at least .txt and .doc files.) Add your optional description of the file in the Custom Description, so they know your sentiment even if they don’t open your file (which is pretty likely); I labeled mine “Block Comcast-TWC merger”.

Yay! You live in an arguably democratic country!

Amazon Web ServicesAWS Support API Update - Attachment Creation and Lightweight Monitoring

The AWS Support API provides you with programmatic access to your support cases and to the AWS Trusted Advisor. Today we are extending the API in order to give you more control over the cases that you create and a new, lightweight way to access information about your cases. The examples in this post make use of the AWS SDK for Java.

Creating Attachments for Support Cases
When you create a support case, you may want to include additional information along with the case. Perhaps you want to attach some sample code, a protocol trace, or some screen shots. With today's release you can now create, manage, and use attachments programmatically.

My colleague Kapil Sharma provided me with some sample Java code to show you how to do this. Let's walk through it. The first step is to create an Attachment from a file (File1 in this case):

Attachment attachment = new Attachment();
attachment.setData(ByteBuffer.wrap(Files.readAllBytes(FileSystems.getDefault().getPath("", "File1"))));

Then you create a List of the attachments for the case:

List<Attachment> attachmentSet = new ArrayList<Attachment>();
attachmentSet.add(attachment);

And upload the attachments:

AddAttachmentsToSetRequest addAttachmentsToSetRequest = new AddAttachmentsToSetRequest();
addAttachmentsToSetRequest.setAttachments(attachmentSet);
AddAttachmentsToSetResult addAttachmentsToSetResult = client.addAttachmentsToSet(addAttachmentsToSetRequest);

With the attachment or attachments uploaded, you next need to get an Id for the set:

String attachmentSetId = addAttachmentsToSetResult.getAttachmentSetId();

And then you are ready to create the actual support case:

CreateCaseRequest request = new CreateCaseRequest();
request.setAttachmentSetId(attachmentSetId);

CreateCaseResult result = client.createCase(request);

Once you have created a support case or two, you probably want to check on their status. The describeCases function lets you do just that. In the past, this function returned a detailed response that included up to 15 MB of attachments. With today's enhancement, you can now ask for a lightweight response that does not include any attachments. If you are calling describeCases to check for changes in status, you can now do this in a more efficient fashion.

DescribeCasesRequest request = new DescribeCasesRequest();
request.setIncludeCommunications(false); // omit communications (and their attachments) for a lightweight response

To learn more about creating and managing cases programmatically, take a look at Programming the Life of an AWS Support Case.

Available Now
The new functionality described in this post is available now and you can start using it today! The SDK for PHP, SDK for .NET, SDK for Ruby, SDK for Java, SDK for JavaScript in the Browser, and the AWS Command Line Interface have all been updated.

-- Jeff;

Jeremy Keith (Adactio)Overcast and Huffduffer

Marco Arment has released his podcast app Overcast for iOS—you can read his introduction to the app.

It plays nicely with Huffduffer. If you want to listen to any Huffduffer feed in Overcast, it’s a straightforward process.

Step 1

<figure> Overcast Add podcast <figcaption>Launch the app and tap on “add a podcast”. Then tap on “Add URL” in the top right.</figcaption> </figure>

Step 2

<figure> Add URL Huffduffer URL <figcaption>Enter the Huffduffer URL e.g.</figcaption> </figure>

Step 3

<figure> All Podcast episode <figcaption>That’s it! You’re all set.</figcaption> </figure>

It’s pretty straightforward to subscribe to Huffduffer URLs in other iOS apps too. Matt has written up the process of using Huffduffer and Instacast. And there’s also a write-up of using Huffduffer and Downcast.

Jeremy Keith (Adactio)Cory Doctorow at dConstruct 2014

This year’s dConstruct will be the tenth one. Ten! That’s a cause for celebration.

The very first dConstruct back in 2005 was a small affair. But we pulled off something of a coup by having the one and only Cory Doctorow deliver the closing talk. You can still listen to the talk—along with every dConstruct talk ever—at the dConstruct archive.

Cory Doctorow

It’s a great talk that still holds up a decade later. Cory’s passion for freedom and technology (and maintaining the intersection of both) is palpable.

Here we are in 2014 and the theme for dConstruct is “Living With The Network.” Who better to deliver a keynote address than Cory Doctorow?

That’s right—he’s back!

Cory Doctorow

I love the symmetry of having Cory at the first and the tenth dConstruct. Also: he’s absolutely bloody brilliant!

Get your ticket for dConstruct 2014 now. It’s going to be a magnificent day.

See you on September 5th!

Doug Schepers (Vectoreal)A Life in a Day, in 2024

I woke up startled; my glasses were ringing. I was late for a telcon… again. I’d stayed up working too late last night.

I slipped on my glasses and answered the call. Several faces popped up a few feet in front of my eyes. Okay, so it was a videocon… sigh. I muted and blanked my glasses, switched them to speakerphone, and placed them on the table, the lenses vibrating as speakers. I pulled on some clothes and rubbed my face awake, trotting into the bathroom with my glasses in my left hand. As I splashed some water on my face, I heard my name called from my glasses on the counter; “Doug, did you get in contact with them?”

“Specs, delay,” I told my glasses, and my phone agent told the other participants, politely, “Please wait for 10 seconds for a response.”

Drying my face quickly on a towel, I put my glasses back on, looked into the mirror, unblanked the camera and unmuted the mic, and replied, “Hey, folks, yes, sorry about that. I did talk to them, and they are pretty receptive to the idea. They have their own requirements, of course, but I’m confident that we can fold those into our own.” I noticed my own face in the display, broadcast from my camera’s view of my reflection in the mirror, and hastily straightened out my sleepy hair.

A few minutes later, when the topic had changed, I opened a link that someone dropped into the call, and started reading the document in the glasses’ display. With the limited space available, I scanned it in RSVP (rapid serial visual presentation) mode, but quickly found it too distracting from the simultaneous conversation, requiring too much concentration. So I muted and blanked again, and walked down the hall to my office. Ensconced in front of my big screen, I re-routed the call to use the screen’s videocamera and display.

On the screen, it was easier to scan the document at my leisure. I could easily shift my focus back to the conversation when needed, without losing my place in the document. I casually highlighted a few passages to follow up on later, and made a few notes. I did the same with another document linked from the telcon, and my browser told me that this page was linked to from a document I’d annotated several months before. I marked it to read and correlate my notes in depth after the call. One thing that stood out immediately was that both documents mentioned a particular book; I was pretty sure I’d bought a physical copy a couple of years before, and I stepped over to my bookshelves. I set my glasses camera on auto-scan, looking for the title via OCR, and on the third set of shelves, my glasses blinked on a particular book; sure enough, I had a copy. I guess I could have simply ordered a digital version, but I already had the physical edition handy, and sometimes I preferred having a real book in my hands.

My stomach started grumbling before the call ended. I decided to go out to lunch. Throwing the book and one of my tablets into my bag, I asked my glasses to pick a restaurant for me. It scanned the list of my favorites, and looked also for new restaurants with vegetarian food, looking for a nice balance between distance, ratings, and number of current patrons. “I’ve found a new food truck, with Indian-Mexican fusion. It’s rated 4.5, and there are several vegetarian options. Dave Cowles is eating there now. It’s a 7-minute drive. Is that okay, or should I keep looking?”

“Nope, sounds great. Call Dave, would you?” A map popped up, giving me an overview of the location, then faded away until it was needed. A symbol also popped up, indicating that my call to Dave had connected, on a private peer session.

“Hey, Doug, what’s up?”

“I was thinking of going to that food truck…”, I glanced up and to the right, and my glasses interpreted my eye gesture as a request for more context information, displaying the name of the restaurant, “… Curry Favor. You’re there now, right? Any good?”

“I just got here myself. Want me to stick around?”

“Yeah, I’ll be there in about 10 minutes.” I headed out the door, and unhooked my car’s charger before I jumped in. My glasses showed the next upcoming direction, and the car infographics; the car had a full charge. “Music”, I said, as I drove off; my car interface picked a playlist for me, a mix of my favorites I hadn’t heard in a while, and some new streaming stuff my music service thought I would like. As I got out of range of my house’s wifi, my glasses switched seamlessly to the car’s wifi. It was an easy drive, with my glasses displaying the optimal route and anticipating shifting traffic patterns and lights, but I still thought how nice it would be to buy one of the self-piloted cars. My car knew my destination from my glasses, and it alerted me that a parking spot had just opened up very near the food truck, so I confirmed and it reserved the spot; I’d pay an extra 50¢ to hold the spot until I arrived, but it was well worth it. My glasses read the veggie menu options out loud on demand, and I chose the naan burrito with palak paneer and chick peas; my glasses placed my order in advance via text.

I pulled into my parking space, and my glasses blinked an icon telling me the sub-street sensor had registered my car’s presence. Great parking spot… I was right across the street from the food truck. I walked over to the benches where Dave sat. “Hey, Dave.”

We exchanged a few words, but my glasses told me my order was ready in a flash. I went to the window, and picked up my burrito; the account total came up in my view, and I picked a currency to pay it; I preferred to use my credit union’s digital currency, and was glad when the food truck’s agent accepted it. “Thanks, man,” I smiled at the cashier.

Dave and I hadn’t seen each other in a while, and we caught up over lunch. It turned out he was working on a cool new mapping project, and I drilled him for some details; it wasn’t my field, but it was interesting, and you never knew when a passing familiarity might come in handy. With his okay, my glasses recorded part of our conversation so I could make more detailed notes, and his glasses sent me some links to look at later. We finished our food quickly –it was tasty, so I left a quick positive review– and walked to a nearby coffee shop to continue the conversation. While we were talking, Dave recommended an app that I bought, and I also bought a song from the coffee shop that caught my ear from their in-house audio stream; Dave and the coffee shop each got a percentage of the sale. I learned that the coffee shop got an even bigger share of the song, because the musician had played at their shop and they’d negotiated a private contract, in exchange for promotion of her tour, which popped up in my display; that was okay, I liked supporting local businesses, and I filed away the tour dates in my calendar in case it was convenient for me to go to the show.

Dave went back to work, and I settled into the coffee shop to do some reading. First I read some of the book I’d brought, making sure to quickly glasses-scan the barcode first so I could keep a log; I found several good pieces of information, which I highlighted and commented on; my glasses tracked my gaze to OCR the text for storage and anchoring, and I subvocalized the notes. I then followed up on the links from earlier; my agent had earned its rate, having found several important correlations between the documents and my notes, as well as highly-reputed annotations from others on various annotation repos, and I thought more about next steps. I followed a few quick links to solidify my intuition, but on one link, I got stopped abruptly at an ad-wall; for whatever reason, this site insisted I watch a 15-second video rather than just opting-in to a deci-cent micropayment, as I usually did when browsing. I tolerated the video –unfortunately, if I took my glasses off while it played, the ad would know– only to find that the whole site was ad-based… intolerable, so I did some keyword searching to find an alternate site for the information.

Light reading and browsing was fine in a public place, but to get any real work done, I needed privacy. I strolled back to my car –my glasses reminding me where I’d parked– and I returned home. Back in my office, I put on some light music, and started coding. I started with a classic HTML-CSS-SVG-JS doc-app component framework on my local box, because I was old-school, and went mobile from there, adding annotated categories to words and phrases for meaning-extraction, customizing the triple-referenced structured API, dragging in a knowledge-base and vocabulary for the speech interface and translation engine, and establishing variable-usage contract terms with service providers (trying to optimize for low-cost usage when possible, and tweaking so the app would automatically switch service providers before it hit the next payment threshold… I’m cheap, and most of my users are too). I didn’t worry much about tweaking the good-enough library-default UI, since most users would barely or rarely see any layout, but rather would interact with the app through voice commands and questions, and see only microviews; I paid more attention to making sure that the agents would be able to correctly index and correlate the features and facts. Just as I was careful to separate style from content, I was careful to separate semantics from content. At some point, I reflected, AIs would get powerful enough so that information workers wouldn’t have such an easy time making a living; I wondered if we’d even need markup or APIs or standards at all, or if the whole infrastructure would be dynamic and ad-hoc. Maybe the work I was doing was making me obsolete. “‘Tis a consummation devoutly to be wished,” I thought to myself wryly.

I put the finishing touches on the app prototype, wrote some final tests, and ran through a manual scenario walk-through to pass the time while the test framework really put the app through its paces, spawning a few hundred thousand virtual unique concurrent users. Other than a few glitches to be polished up, it seemed to work well. I was pretty proud of this work; the app gave me real-time civic feedback, including drill-down visualization, on public policy statements, trawling news sites, social networks, and annotation servers for sentiment and fact-checking; it balanced opinion with cost-benefit risk-scenarios weighted by credibility and likelihood, and managed it all with voting records of representatives. It also tracked influence, either by lobbying or donations or inferred connections, and correlated company ownership chains and investments, to give a picture of who was pushing whose buttons, and it would work equally well for boycotting products based on company profiles as it would for holding politicians accountable. As part of the ClearGov Foundation’s online voting system, it stood a chance of reforming government, though it was getting more adoption in South America and Africa than it was in the US so far. Patience, patience…

Megan came home from work with dinner from a locavore kitchen; the front door camera alerted me to her approach, and I saw she had her hands full. “Open front door,” I told the house as I rose to help her. We ate in front of the wallscreen, watching some static, non-interactive comedy streams; we were both too tired to “play-along” with plots, character POV, or camera angles, and it wasn’t really our style anyway. I hadn’t gotten enough rest the night before, so I turned in early to read; the mattress turned off the bedside light when it sensed my heart-rate and breathing slow into sleep.

Note: This story of the Web and life in 2024 is clearly fictional; nobody would hire someone who’d worked in web standards to do real programming work.

ProgrammableWeb: APIsPemilu Berita

The Pemilu Berita API provides a curated news feed for the 2014 Indonesian elections. Users can retrieve a list of recent posts, posts matching a given search term, posts with a given tag, or a single post. The Berita API is part of API Pemilu, a collection of APIs that provide Indonesia's approximately 187 million voters with important election information.
Date Updated: 2014-07-16

Amazon Web ServicesAmazon RDS PostgreSQL Update - Service Level Agreement and General Availability

We launched RDS PostgreSQL at AWS re:Invent in November of 2013 in order to bring the benefits of a managed database service to the PostgreSQL community.

Customers of all sizes are bringing their mission-critical PostgreSQL workloads to RDS at a rapid clip. Here is a small sample of the applications that have been launched on top of RDS PostgreSQL in the past 7 months:

Our RDS partners ( ESRI, Boundless, and Jaspersoft to name a few) are helping their customers to take advantage of the power of RDS PostgreSQL.

SLA and General Availability
We have experienced strong customer adoption and accumulated plenty of operational experience since the beta launch, and we are happy to announce that, effective July 1, 2014, RDS PostgreSQL is included in the Amazon RDS SLA and is generally available.

The SLA provides for 99.95% availability for Multi-AZ database instances on a monthly basis. If availability falls below 99.95% for a Multi-AZ database instance (which is a maximum of 22 minutes of unavailability for each database instance per month), you will be eligible to receive service credits. The Amazon RDS SLA is designed to give you additional confidence to run the most demanding and mission critical workloads dependably in the AWS cloud.
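The parenthetical 22-minute figure follows directly from the 99.95% availability target; a quick check of the arithmetic (assuming a 30-day month):

```java
public class SlaBudget {
    public static void main(String[] args) {
        double minutesPerMonth = 30 * 24 * 60;                  // 43,200 minutes in a 30-day month
        double downtimeBudget = minutesPerMonth * (1 - 0.9995); // 0.05% unavailability allowance
        System.out.println(downtimeBudget);                     // roughly 21.6 minutes, i.e. the ~22 cited above
    }
}
```

A 31-day month gives about 22.3 minutes, so the SLA's "maximum of 22 minutes" is the round-number version of the same budget.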

Let's take a look at some of the ways that our customers are making use of RDS for PostgreSQL!

6Wunderkinder - Mac App Store App of the Year on RDS PostgreSQL
6Wunderkinder is the creator of Wunderlist, a cross-platform task management application and the winner of the Mac App Store App of the Year for 2013. We spoke to Chad Fowler (CTO) to learn more about their use of RDS. This is what he had to say:

We have millions of connected and constantly polling clients. This generates massive amounts of usage data in our operational data stores in PostgreSQL. We were able to hand off the database management aspects to RDS and leverage the beefiest 244GB instance to handle our production workload.

With Provisioned IOPS we were able to achieve the I/O throughput demanded by our application. Enabling Multi-AZ has given us the peace of mind to rely on RDS for high-availability and focus on what we do best - build a robust task management application.

Netflix - Open Source Security on RDS PostgreSQL
Online content provider Netflix is able to support seamless global service by partnering with AWS for services and delivery of content. Cloud Security Architect Jason Chan provided some perspective on their use of RDS for PostgreSQL:

We recently announced the open source availability of Security Monkey, our solution for monitoring and analyzing the security of our Amazon Web Services configurations. We leveraged RDS PostgreSQL to capture the security data gathered by our solution.

Building an application backed by a production-ready PostgreSQL database, with Multi-AZ high availability, automated backups, patching, and upgrades handled by RDS, helped us focus on our development and bring this powerful open source solution to the community.

Illumina - Building Global Scale Applications on RDS PostgreSQL
Illumina is a leading developer, manufacturer and marketer of life science tools and integrated systems for large-scale analysis of genetic variation and function. Greg Roberts, Development Lead for the BaseSpace product, talked about RDS PostgreSQL:

Using BaseSpace, biologists and informaticians can easily and securely analyze, archive and share sequencing data. We have been able to leverage RDS for PostgreSQL since the first day of our BaseSpace launch. We scaled our instances seamlessly as our project grew.

Automated backups and automated failovers with Multi-AZ provided us the high availability and protection against data loss that our customers expect.

TigerLogic - Social Media Analytics at Postano on RDS PostgreSQL
TigerLogic Corporation has been helping companies make better use of their data for more than three decades. Their Postano Platform helps leading brands to create engaging social visualizations. Danny Hyun, Director of Engineering for TigerLogic, had a lot to say about RDS for PostgreSQL:

The Postano Platform helps leading brands create highly engaging social visualizations leveraging the top social networks in use today. Working with RDS PostgreSQL since its launch gave us the confidence we could build a scalable, robust solution for our customers without the hassles of dealing with database management. The Platform processes more than a million social messages daily, and during special occasions - like national and global award shows and sporting events - more than 10 million messages will be stored and read in our RDS PostgreSQL instance in a very short period of time. Leveraging the abilities of the most powerful RDS instance, cr1.8xlarge with Multi-AZ capabilities, we were able to bring our production database workloads onto RDS early on.

Get Started Today
You can get started with Amazon RDS for PostgreSQL today. You can launch an instance with a couple of clicks in the AWS Management Console. To learn more, visit the RDS for PostgreSQL page or the RDS Documentation.

-- Jeff;

Amazon Web Services: New Annual Pricing for AWS Marketplace Products

I'm writing today to tell you about an important new feature for AWS Marketplace. As you may know, the AWS Marketplace lets you find, buy, and immediately start using a wide variety of software for developers and enterprises — 26 categories that span infrastructure, developer tools, and business applications.

You can now purchase AWS Marketplace products on an annual subscription basis. You make a single upfront payment by selecting the annual option and an Amazon Elastic Compute Cloud (EC2) instance type, and you then have unlimited use of the software on that EC2 instance type for the next 12 months.

Annual subscriptions are available for more than 90 software products from ISVs such as Alert Logic, Barracuda, Citrix, Fortinet, MicroStrategy, Progress Software, Riverbed, Sophos, Tenable, and Vormetric.

You can purchase an annual subscription with a couple of clicks:

  • Find the desired product, note the annual subscription price, and click Continue.
  • Specify the number of subscriptions that you would like to buy.
  • Click Accept Terms & Launch.

Benefits of Annual Pricing
This new option provides you with several important benefits:

  • Predictability - The annual pricing model will allow you to make more accurate forecasts of your software expenses when you are running steady-state workloads.
  • Cost Savings - You can reduce your software costs by 10% to 40% or more when you purchase an annual subscription for select software products on AWS Marketplace. You can continue to pay for usage on an hourly basis in situations where your workload is bursty.
  • Flexibility - You can run the software in any Availability Zone of any AWS Region in which the software is offered. The purchase is, however, specific to a particular EC2 instance type.
  • Ease of Use - You can change your software pricing model without restarting any instances or re-launching any applications.
  • Uniformity - As is the case with hourly pricing, all annual subscription charges will appear on your AWS bill. You don't have to set up any new accounts or share the payment information with a third party.
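To see how the cost-savings and flexibility claims interact, it helps to compute the break-even utilization for a product; the prices below are hypothetical, purely for illustration, not actual Marketplace rates:

```python
def break_even_hours(annual_price: float, hourly_price: float) -> float:
    """Hours of use per year above which the annual subscription is cheaper."""
    return annual_price / hourly_price

# Hypothetical product: $0.50/hour, or $2,628/year (a 40% discount off 24x7 hourly).
hours = break_even_hours(2628.0, 0.50)
print(hours)          # 5256.0 hours
print(hours / 8760)   # 0.6 -- annual wins if the instance runs >60% of the year
```

Below that utilization, staying on hourly pricing (the "bursty workload" case mentioned above) remains the cheaper option.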

The Details
If you are currently paying for a product on an hourly basis, you can convert to annual subscription pricing by simply buying annual subscription(s) as needed. You do not need to restart the instance or re-launch the application.

Let's walk through the process of purchasing a product through AWS Marketplace. The first step is to search for the product you are interested in. In this case I am looking for the Sophos UTM 9 security gateway:

Then I review the pricing details:

And make the purchase:

There will be a short wait while we spin up the EC2 instance, deploy the software and purchase your annual subscription:

Once the software is running, you can click the Usage Instructions button to check on the next steps. If you want to add, cancel or change a subscription you can always go to the Your Software Subscriptions page to manage your software:

If you are an ISV and your software is already in AWS Marketplace, you can offer an annual subscription by submitting annual prices for your product.

Available Now
As I mentioned earlier, annual pricing is available in AWS Marketplace now!

-- Jeff;

ProgrammableWeb APIs: Bitcointy

Bitcointy is an unaffiliated API that reports the value of a bitcoin in most of the world’s currencies. As trends change, developers may find it practical to build value-exchange applications around emerging forms of online transactions. The API supports the JSON format. Developers in the new finance industry can benefit from this API because the documentation shows how to retrieve currencies, average prices, and visualized data.
Date Updated: 2014-07-15

Amazon Web Services: Amazon SNS TTL (Time to Live) Control

Amazon Simple Notification Service is a fast and flexible push messaging service.

Many of the messages that you can send with SNS are relevant or valuable for a limited period of time. Sports scores, weather notifications, and "flash sale" announcements can all get stale in a short period of time. In situations where devices are offline or disconnected, flooding the user with outdated messages when communication is reestablished makes for a poor user experience.

In order to allow you to build great applications that behave well in an environment with real-time information and intermittent connections, Amazon SNS now allows you to set a TTL (Time to Live) value of up to two weeks for each message. Messages that remain undelivered for the given period of time (expressed as a number of seconds since the message was published) will expire and will not be delivered. Read about How to Use Time to Live Values.

Most of the underlying push services support TTL in some fashion, but each one uses a unique set of APIs and data formats. With today's release, SNS now lets you use a common format and the cross-platform Publish API to define TTL values for iOS, Android, Fire OS, Windows WNS, and Baidu endpoints (Windows MPNS does not support TTL).

You can set the TTL through the SNS API or the AWS Management Console:
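With the SDKs, the TTL travels as a per-message attribute on the Publish call, named per platform (for example AWS.SNS.MOBILE.GCM.TTL or AWS.SNS.MOBILE.APNS.TTL), with the value given as a string number of seconds. A minimal sketch of building that attribute map; the helper function and endpoint handling here are illustrative, not part of any SDK:

```python
def ttl_attributes(platform: str, ttl_seconds: int) -> dict:
    """Build the SNS message attribute that sets a per-message TTL.

    The attribute name follows the pattern AWS.SNS.MOBILE.<PLATFORM>.TTL
    (e.g. GCM, APNS, ADM, BAIDU); the value is a string number of seconds.
    """
    return {
        "AWS.SNS.MOBILE.%s.TTL" % platform: {
            "DataType": "String",
            "StringValue": str(ttl_seconds),
        }
    }

# Expire a flash-sale push after one hour on Android (GCM) endpoints. These
# attributes would be passed as MessageAttributes on an SNS Publish call,
# e.g. sns.publish(TargetArn=endpoint_arn, Message=..., MessageAttributes=attrs)
attrs = ttl_attributes("GCM", 3600)
print(attrs)
```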

This new feature, in conjunction with our recent launch of mobile push support for Windows (phone and desktop) and Baidu Cloud Push (Android), will help you to build helpful, user-friendly applications that reach a broad user base without having to deal with a multitude of messaging providers.

-- Jeff;

Amazon Web Services: Store and Monitor OS & Application Log Files with Amazon CloudWatch

When you move from a static operating environment to a dynamically scaled, cloud-powered environment, you need to take a fresh look at your model for capturing, storing, and analyzing the log files produced by your operating system and your applications. Because instances come and go, storing them locally for the long term is simply not appropriate. When running at scale, simply finding storage space for new log files and managing expiration of older ones can become a chore. Further, there's often actionable information buried within those files. Failures, even if they are one in a million or one in a billion, represent opportunities to increase the reliability of your system and to improve the customer experience.

Today we are introducing a powerful new log storage and monitoring feature for Amazon CloudWatch. You can now route your operating system, application, and custom log files to CloudWatch, where they will be stored in durable fashion for as long as you'd like. You can also configure CloudWatch to monitor the incoming log entries for any desired symbols or messages and to surface the results as CloudWatch metrics. You could, for example, monitor your web server's log files for 404 errors to detect bad inbound links or 503 errors to detect a possible overload condition. You could monitor your Linux server log files to detect resource depletion issues such as a lack of swap space or file descriptors. You can even use the metrics to raise alarms or to initiate Auto Scaling activities.

Vocabulary Lesson
Before we dig any deeper, let's agree on some basic terminology! Here are some new terms that you will need to understand in order to use CloudWatch to store and monitor your logs:

  • Log Event - A Log Event is an activity recorded by the application or resource being monitored. It contains a timestamp and raw message data in UTF-8 form.
  • Log Stream - A Log Stream is a sequence of Log Events from the same source (a particular application instance or resource).
  • Log Group - A Log Group is a group of Log Streams that share the same properties, policies, and access controls.
  • Metric Filters - The Metric Filters tell CloudWatch how to extract metric observations from ingested events and turn them into CloudWatch metrics.
  • Retention Policies - The Retention Policies determine how long events are retained. Policies are assigned to Log Groups and apply to all of the Log Streams in the group.
  • Log Agent - You can install CloudWatch Log Agents on your EC2 instances and direct them to store Log Events in CloudWatch. The Agent has been tested on the Amazon Linux AMIs and the Ubuntu AMIs. If you are running Microsoft Windows, you can configure the ec2config service on your instance to send systems logs to CloudWatch. To learn more about this option, read the documentation on Configuring a Windows Instance Using the EC2Config Service.

Getting Started With CloudWatch Logs
In order to learn more about CloudWatch Logs, I installed the CloudWatch Log Agent on the EC2 instance that I am using to write this blog post! I started by downloading the install script:

$ wget

Then I created an IAM user using the policy document provided in the documentation and saved the credentials:

I ran the installation script. The script downloaded, installed, and configured the AWS CLI for me (including a prompt for AWS credentials for my IAM user), and then walked me through the process of configuring the Log Agent to capture Log Events from the /var/log/messages and /var/log/secure files on the instance:

Path of log file to upload [/var/log/messages]: 
Destination Log Group name [/var/log/messages]: 

Choose Log Stream name:
  1. Use EC2 instance id.
  2. Use hostname.
  3. Custom.
Enter choice [1]: 

Choose Log Event timestamp format:
  1. %b %d %H:%M:%S    (Dec 31 23:59:59)
  2. %d/%b/%Y:%H:%M:%S (10/Oct/2000:13:55:36)
  3. %Y-%m-%d %H:%M:%S (2008-09-08 11:52:54)
  4. Custom
Enter choice [1]: 1

Choose initial position of upload:
  1. From start of file.
  2. From end of file.
Enter choice [1]: 1
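For reference, the answers above end up as a stanza in the agent's configuration file (the exact path varies by install method; the keys shown are the standard awslogs agent options, but treat this as an approximate sketch rather than a verbatim copy of what the script produces):

```ini
[/var/log/messages]
file = /var/log/messages
log_group_name = /var/log/messages
log_stream_name = {instance_id}
datetime_format = %b %d %H:%M:%S
initial_position = start_of_file
```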

The Log Groups were visible in the AWS Management Console a few minutes later:

Since I installed the Log Agent on a single EC2 instance, each Log Group contained a single Log Stream. As I specified when I installed the Log Agent, the instance id was used to name the stream:

The Log Stream for /var/log/secure was visible with another click:

I decided to track the "Invalid user" messages so that I could see how often spurious login attempts were made on my instance. I returned to the list of Log Groups, selected the stream, and clicked on Create Metric Filter. Then I created a filter that would look for the string "Invalid user" (the patterns are case-sensitive):
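Because the patterns are case-sensitive, a filter on "Invalid user" will not count lowercase variants. A tiny illustration of the matching behavior, with plain Python substring matching standing in for the filter engine (the log lines are made up):

```python
log_lines = [
    "sshd[1023]: Invalid user admin from 198.51.100.7",
    "sshd[1031]: invalid user admin from 198.51.100.7",   # lowercase: not matched
    "sshd[1047]: Accepted publickey for ec2-user",
]

pattern = "Invalid user"
matches = [line for line in log_lines if pattern in line]
print(len(matches))  # 1 -- only the exact-case line counts toward the metric
```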

As you can see, the console allowed me to test potential filter patterns against actual log data. When I inspected the results, I realized that a single login attempt would generate several entries in the log file. I was fine with this, so I stepped ahead, named the filter, and mapped it to a CloudWatch namespace and metric:

I also created an alarm to send me an email heads-up if the number of invalid login attempts grows to a suspiciously high level:

With the logging and the alarm in place, I fired off a volley of spurious login attempts from another EC2 instance and waited for the alarm to fire, as expected:

I also have control over the retention period for each Log Group. As you can see, logs can be retained forever (see my notes on Pricing and Availability to learn more about the cost associated with doing this):

Elastic Beanstalk and CloudWatch Logs
You can also generate CloudWatch Logs from your Elastic Beanstalk applications. To get you going with a running start, we have created a sample configuration file that you can copy to the .ebextensions directory at the root of your application. You can find the files at the following locations:

Place the files in the folder, then build and deploy your application as normal. Click on the Monitoring tab in the Elastic Beanstalk Console, and then press the Edit button to locate the new resource and select it for monitoring and graphing:

Add the desired statistic, and Elastic Beanstalk will display the graph:

To learn more, read about Using AWS Elastic Beanstalk with Amazon CloudWatch Logs.

Other Logging Options
You can push log data to CloudWatch from AWS OpsWorks, or through the CloudWatch APIs. You can also configure and use logs with AWS CloudFormation.

In a new post on the AWS Application Management Blog, Using Amazon CloudWatch Logs with AWS OpsWorks, my colleague Chris Barclay shows you how to use Chef recipes to create a scalable, centralized logging solution with nothing more than a couple of simple recipes.

To learn more about configuring and using CloudWatch Logs and Metrics Filters through CloudFormation, take a look at the Amazon CloudWatch Logs Sample. Here's an excerpt from the template:

"404MetricFilter": {
    "Type": "AWS::Logs::MetricFilter",
    "Properties": {
        "LogGroupName": {
            "Ref": "WebServerLogGroup"
        },
        "FilterPattern": "[ip, identity, user_id, timestamp, request, status_code = 404, size, ...]",
        "MetricTransformations": [
            {
                "MetricValue": "1",
                "MetricNamespace": "test/404s",
                "MetricName": "test404Count"
            }
        ]
    }
}
Your code can push a single Log Event to a Log Stream using the putLogEvents function. Here's a PHP snippet to get you started:

$result = $client->putLogEvents(array(
    'logGroupName'  => 'AppLog',
    'logStreamName' => 'ThisInstance',
    'logEvents'     => array(
        array(
            'timestamp' => time() * 1000, // milliseconds since the epoch
            'message'   => 'Click!',
        ),
    ),
    'sequenceToken' => 'string',
));

Pricing and Availability
This new feature is available now in the US East (Northern Virginia) Region and you can start using it today.

Pricing is based on the volume of Log Events that you store and how long you choose to retain them. For more information, please take a look at the CloudWatch Pricing page. Log Events are stored in compressed fashion to reduce storage charges; there are 26 bytes of storage overhead per Log Event.
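The 26-byte overhead is negligible for most workloads but easy to estimate at scale. A quick sketch (the event sizes are illustrative, and this ignores the compression applied to message bodies):

```python
def stored_bytes(event_count: int, avg_message_bytes: int, overhead: int = 26) -> int:
    """Approximate stored size: message bytes plus the per-event overhead."""
    return event_count * (avg_message_bytes + overhead)

# 10 million 100-byte events: the overhead adds 260 MB on top of ~1 GB of messages.
total = stored_bytes(10_000_000, 100)
print(total)                # 1260000000
print(10_000_000 * 26)      # 260000000 bytes of overhead alone
```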

-- Jeff;

Amazon Web Services: AWS Week in Review - July 7, 2014

Let's take a quick look at what happened in AWS-land last week:

Monday, July 7
Tuesday, July 8
Wednesday, July 9
Thursday, July 10
Friday, July 11

Stay tuned for next week! In the meantime, follow me on Twitter and subscribe to the RSS feed.

-- Jeff;

ProgrammableWeb APIs: Versatility Werks

VersCaptcha is a convenient tool that prevents online spam using original captcha icons. It is also a service that keeps bots off web pages. With these features, there will be a noticeable difference in spam reduction and far less unwanted junk e-mail. Web designers, webmasters, and web developers can benefit from the VersCaptcha API, which offers captchas built from images, icons, and questions. To get an API key, developers need to sign up; once logged in, they will see how the API handles spam prevention for mobile applications and can download the files in PHP. The main features of the VersCaptcha API are security, integration, custom icons, custom questions, and custom CSS. The site posts periodic updates on social media under the name Versatility Werks.
Date Updated: 2014-07-14

ProgrammableWeb APIs: OpenEd

OpenEd is an open source e-learning management system for K-12 students. What makes this site different is the flipped-classroom approach, where students can collaborate in a more flexible environment. OpenEd focuses on Common Core resources for educational games, video lessons, assessments, and courses. The API integrates Common Core standards into a platform that can be accessed from apps and websites. This is particularly useful for developers who work with school districts and educational companies. On the documentation site, they can find information about school standards, management of taxonomies, and foundations of e-learning success.
Date Updated: 2014-07-14

Jeremy Keith (Adactio): The tide

Thank you to everyone who has donated—or is going to donate—to the Oregon Humane Society in Chloe’s memory.

Thank you to everyone who sent me words of comfort. I really, really appreciate it.

I’ve been surprised by where else I’ve found comfort:

Tomorrow is Monday, the start of the working week. Tomorrow I will go to work. Tomorrow I will, to all outward appearances, carry on as normal.

Except it won’t really be normal. It’s going to be very strange. The world feels very strange to me. A world without Chloe isn’t right. It isn’t normal. A world without Chloe feels wrong. Skewed. Off-kilter.

But I’m going to go into work. I’m going to do some hacking. I’m going to write about code. I’m going to post links related to web design and development. I’m going to get back to organising this year’s dConstruct. (Can you believe that the last time I was IMing with Chloe, I was bitching to her about lacklustre ticket sales? What a fucking joke.)

In short, I’m going to carry on. Even though the world feels wrong. I’m not sure if the world will ever feel right again.

I thought that grief was like a tsunami. It’s unstoppable. It washes over you completely. It flattens you and leaves you battered and bruised. But then it’s over, right?

It turns out that grief is more like the tide. The tsunami was just the first wave. There will be many more.

Over the course of a single day, many a wave will hit me unexpectedly and I’ll find myself weeping …again. Over time, those waves will abate. But grief is fractally tidal. There are longer waves—days, weeks, months, years.

Remy has endured four years of grief and counting:

Time won’t ever heal this hole in our lives. It shouldn’t either.

But he carries on, even though the world is wrong:

You just get stronger. You have to.

It doesn’t mean it doesn’t hurt anymore. It does. I’m just able to carry that pain and make it mine and part of me, because I’ve learnt how to.

Time doesn’t heal. It just looks that way from the outside in.

So tomorrow I’ll go back to work, and I’ll go back to writing and coding and talking and organising. Perhaps those activities might provide their own comfort.

I know that the tide will never stop, but I hope that it will at least weaken in strength over time.

A world without Chloe is wrong, but that’s the world I live in now. There won’t be a day goes by that I won’t be thinking of her.

Jeremy Keith (Adactio): For Chloe

We all grieve in different ways. We all find solace and comfort in different places.

There can be solace in walking. There can be comfort in music. Tears. Rage. Sadness. Whatever it takes.

Personally, I have found comfort in reading what others have written about Chloe …but I know Chloe would be really embarrassed. She never liked getting attention.

Chloe must have known that people would want to commemorate her in some way. She didn’t want a big ceremony. She didn’t want any fuss. She left specific instructions (her suicide was not a spur-of-the moment decision).

If you would like to mourn the death—and celebrate the life—of Chloe Weil, she asked that you contribute to one or both of these institutions:

  1. The Oregon Humane Society. This is where Chloe found FACE, her constant companion.
  2. The Internet Archive. Chloe cared deeply about the web and digital preservation.

If you choose to make a donation, thank you. It’s what Chloe wanted.

I still can’t believe she’s gone.

Norman Walsh (Sun): Back to Ubuntu


The best QA wins.

The abstract above is perhaps a little glib. After four or so months with KDE, our hero has moved back to straight Ubuntu even with Unity.

The move to KDE was motivated mostly by the fact that it (the Kubuntu flavor of Ubuntu, specifically) was the first installer that seemed to work reliably on this laptop.

Trouble is, I found all sorts of minor annoyances about KDE and they eventually took their toll.

  • KDE would not remember that I wanted the trackpad off. I had to run synclient TouchpadOff=1 after every reboot.

  • Every time I connected an external monitor, I'd get two display settings widgets. One worked; the other I just had to dismiss.

  • For some reason, VMware would not remember settings. I had to enable shared folders by hand in every VM every time.

  • KDE has two notions of working environments, multiple desktops and…something else. I never got the hang of the other thing.

There are probably a few other things. Most of them fixable, I'm sure. I understand KDE is very configurable, but I never “got into it.” No disrespect intended to my KDE loving brethren.

So I tried a Unity-flavored 14.04 install the other day. Between the NVIDIA drivers and the nox2apic kernel option, it all seems just as fine and stable as the KDE flavor.

Discovering that Super-Tab pops up the Unity “type your command here” box a la Quicksilver, Alfred, and Gnome Do took away a big chunk of my Unity hate. Now I can just hide it and ignore it mostly.

I'm sure I'll be annoyed by this desktop too in time. I've been annoyed by all the others. But for today, I'm calling it an improvement.


ProgrammableWeb APIs: Soshio

Developers can integrate the Soshio Chinese Social Media Intelligence API into customer relationship management systems, third-party applications, and social interactions to create an intelligent monitoring system. It allows content creators to integrate the Chinese language more seamlessly into posts, social interaction, and popular CRM systems. This API is great for consumer goods companies, financial services, marketing agencies, non-profit organizations, and others who want to use big data to identify and track customers on the most popular Chinese social networks.
Date Updated: 2014-07-11

ProgrammableWeb APIs: KeyLemon

Developers can use KeyLemon’s Face Recognition API to integrate facial recognition into any web service. It accepts features, gender, head pose, and other attributes for facial identification that can be used for online authentication. KeyLemon offers complete documentation of wrapper libraries to expedite the development and integration process. KeyLemon also offers an SDK compatible with Mac & Windows.
Date Updated: 2014-07-11

ProgrammableWeb APIs: Cara

Cara is face detection software that turns anything with a camera into an intelligent sensor that gathers data about the content’s viewers. It measures glances, distance, gender, timing, age, and much more, while respecting privacy by not storing pictures or personal information. This robust audience analytics tool is compatible with Windows, Android, Linux, and Mac OS X and is made by IMRSV (formerly Immersive Labs). The Cara Cloud RESTful API provides server-side image processing and is included within the Cara developer tools, along with a Devkit and a Local SDK.
Date Updated: 2014-07-11

ProgrammableWeb APIs: Noldus FaceReader

The FaceReader API by Noldus analyzes emotions and allows integration with third-party software that responds to the results. Used in the application of surveys, it can be a good fit for experiments and research. Using the Project Analysis Module, developers can trigger event markers that launch other applications, or ask their observers questions in response to emotive triggers. The updated API allows for detailed coordinate and image export.
Date Updated: 2014-07-11

ProgrammableWeb APIs: Oddcast Avatar Framework

Oddcast offers a set of APIs that allow developers to add tools to enterprise-level platforms to enhance users’ engagement with their media. The Avatar Framework API allows developers to create customizable avatars for their online community, for use on blogs, messages, personal profiles, and throughout the site in general. Users can upload unique avatar photos and edit them with a variety of controls. The API also allows for recording audio, exporting avatars for use in other web communities, and even monetizing and selling an avatar’s inventory. In general, this framework is a large package with a variety of functionality for online communities. The administration platform has a robust set of editing features to enable moderation.
Date Updated: 2014-07-11

ProgrammableWeb APIs: Oddcast Face Detection

Oddcast offers a set of APIs that allow developers to add tools to enterprise-level platforms to enhance users’ engagement with their media. The Face Detection API uses a client-side SWF object to detect major facial locations (eyes, mouth, nose, chin position, etc.), continuously notifying the host app of the face’s location within the frame.
Date Updated: 2014-07-11

ProgrammableWeb APIs: Oddcast Text to Sing

Oddcast offers a set of APIs that allow developers to add tools to enterprise level platforms to enhance the user’s engagement with their media. The Text to Sing API allows users to type in words and hear them sung back according to the melody of a preset song. Either male or female English speaking voices may be used, and background music can be inserted as well. This Flash-based service integrates well with all other Oddcast APIs.
Date Updated: 2014-07-11

Shelley Powers (Burningbird): Writing from the Bleeding Edge

a tangled mess

Photo by Jonathan Arehart cc BY-NC-SA 2.0

One of the challenges of writing a book on technology is that not only do you need to put words together in some form of coherent, possibly even interesting, manner, but you also have to understand the underlying technology well enough to be able to explain it to others.

You can't just "talk" about the technology, you have to understand it.

Not a problem, except when you're getting into bleeding edge technology, like some of the ECMAScript 6 objects I'm covering in the second edition of my JavaScript Cookbook.

All you can do is work with the object, work with the object, and work with the object until you go from, "I hate this object. I hate this object. I really f**king hate this object" to, "Oh hey, this object isn't so bad."

Then you can write about it in the book.


Jeremy Keith (Adactio): Chloe

I first started hanging out with Chloe at An Event Apart Boston in 2011. We bonded over a shared love of The Go-Betweens amongst other things.

Chloe and me

It was very easy to be in her company. She was inspiring. Literally.

We became conference buddies. Whether it was Build, Brooklyn Beta, Indie Web Camp or New Adventures, we’d inevitably end up sitting next to each other. It didn’t matter how long it had been since we had last seen each other, it always felt like no time had passed at all.

Photo captions: Jeremy and Chloe at lunch; Chloe & Jeremy; Chloe and Jeremy; attendees at Brooklyn Beta; Jeremy & Chloe, amused by Al's stories

Jessica and I spent a day with Chloe and her dad Julius as they showed us around the real Williamsburg when we were in Brooklyn a few years ago. “The Jew Tour” Chloe called it. It was clear that Julius and Chloe had a wonderful relationship—they weren’t just father and daughter; they were best friends.

Jessica, Julius and Chloe

Chloe and I hacked together at Science Hack Day San Francisco last year. It was so much fun, and I know that Chloe found it very empowering. She even gave a demo in public, which was quite an achievement for her; I remember how terrified she was at the thought of just having to introduce herself at Indie Web Camp in Portland a few years back.

IndieWebCamp 2011

I won’t see Chloe again. Chloe killed herself.

When I heard the news, I couldn’t believe it. I still can’t believe it.

Chloe and I would sometimes communicate online—email, IM, DMs on Twitter—but it was never quite the same as when we were together. I chatted with her just last week. I knew she was sad. I knew that she had many regrets. But I had no idea that she was contemplating suicide.

Now I wonder if there’s something I could have said. Or worse, what if I did say the wrong thing?

I think it’s only natural to look for these kinds of causal relationships. “If only I had done X, I could have prevented Y.” But I suspect that the truth is not as simple as that. Still, those questions haunt me.

But there’s also comfort. Seeing the overwhelming messages of grief and loss makes me realise how many people cared for Chloe. Even if you only met her briefly, you couldn’t help but be bowled over by her.

Smart, creative, funny, beautiful Chloe Weil.

I want to say how much I’ll miss her, but the truth is that I don’t think I’ve really grasped that she’s gone. I just can’t believe it.

Amazon Web ServicesAmazon Zocalo - Document Storage and Sharing for the Enterprise

I have been writing this blog for almost ten years! For most of that time, my workflow for writing and reviewing drafts has revolved around my email inbox. I write a draft and then hand it off to the Product Manager for review. The Product Manager, in turn, will hand the draft off to their team and to other stakeholders and reviewers within the company. Given the pace of AWS development, I often have between five and ten drafts underway at any given time. Reconciling overlapping suggestions for edits, sometimes spread across multiple drafts, is tedious and error-prone. It is clear to me (and to my colleagues) that email inboxes are not appropriate venues for efficiently and securely sharing and reviewing complex documents. We decided to "scratch our own itch" and create a document hub that would relieve the load on our inboxes and add some structure to the process. Given that our enterprise customers have been asking us to provide them with secure storage and sharing, we decided to build a new product!

Introducing Zocalo
Today we are introducing Amazon Zocalo. This is a fully managed, secure document storage and sharing service designed specifically for the needs of the enterprise. As you will see as you review this post, Zocalo provides users with secure access to documents, regardless of their location, device, or formal relationship to the organization. As the owner of a document, you can selectively share it with others (inside or outside of your organization), and you can ask them for feedback, optionally subject to a deadline that you specify.

Zocalo gives you simple, straightforward access to your documents anytime and from anywhere, regardless of location or device. Zocalo supports versioned review and markup of a multitude of document types, and was designed to allow security-conscious administrators to control and audit access to accounts and documents.

With centralized user management (optionally linked to your existing Active Directory) and tight control over sharing, Zocalo prevents boundaries from becoming accidentally blurred. All documents are stored in a designated AWS Region and transmitted in encrypted form. You, as the document owner, can even opt to disallow downloading for extra protection.

You can install the Zocalo client application on your desktop and laptop computers running Windows 7 or OS X (version 10.7 or later) and designate a folder for syncing. Once you do so, saving a file to the folder will automatically upload it to Zocalo across an encrypted connection and sync it to your other devices. You can also access Zocalo from your iPad, Kindle Fire, and Android tablets.

In the remainder of this post I will take a look at Zocalo from three points of view. You will see what it is like to be a document owner, a reviewer, and a Zocalo administrator.

Zocalo for Users
Assuming that you are already known to Zocalo (see Zocalo for Administrators to learn more about accounts and passwords), you can simply visit your organization's Zocalo site and log in. The URL to the site is specific to the organization. Here's where I started:

And here's what I saw after I logged in:

I can create folders and sub-folders as needed and I can add documents to the folder by simply dragging and dropping. I uploaded an early draft of this post:

Zocalo can accommodate files of up to 5 GB. You can upload files of any type; Zocalo will render Office documents, PDFs, images, and text files.

I shared the draft with Paul and Cynthia (two of my colleagues) and asked them to review it for me:

Zocalo shows me their status:

As you may have noticed earlier, I can create folders in Zocalo and store my documents inside. Permissions applied to a folder apply to all of the documents within it, making it easy for me to use folders to organize my documents by project or by team.

I took a short break to check on my garden (it was doing fine) and waited for some feedback. I clicked on Activity to see how things were going. Paul and Cynthia had both left comments within a few minutes (we work at lightning speed at Amazon):

Then I clicked on Feedback to see what they had to say about my first draft. The feedback is organized by version, and is further broken down into an overall comment and individual items, grouped by page:

Then I clicked to see what Paul had to say:

As you can see, clicking on a piece of feedback highlights the target area in the document and also connects it to the comment. Each reviewer has their own, unique color code as well.

The next step for me would be to read and digest all of the comments, edit the document, and upload another version for further review using the menu at the top:

If the document is in Microsoft Word format, I can also download a version that includes all of the comments entered by the reviewers.

There's a lot more to cover, but I'm just getting started and this post is already kind of long! You can try this out for yourself through the Zocalo Limited Preview.

Zocalo for Reviewers
Now I'd like to take a look at the sharing and reviewing process from the other side of the fence. I can easily see the documents that have been shared with me for review:

I can click on any document and open it up to read and comment on it. Zocalo shows me how to give feedback in a handy popup:

I simply highlight any text or any region of the document and enter my feedback:

When I have finished my review, I send the comments to the owner of the document with a click of the Send button:

As you saw earlier, the owner of the document will be able to see my edits and will (with any luck) use them to produce another version.

Once again, I have just scratched the surface of the document sharing and review features that are available in Zocalo. Let's take a look at the administrative side of Zocalo!

Zocalo for Administrators
Each Zocalo account must have at least one administrator. The administrator is responsible for creating and managing user accounts, setting up security policies, managing storage limits, and generating auditing and activity reports.

As the administrator in charge of setting up and running Zocalo, you will begin with the AWS Management Console:

You can choose Quick Start to get going quickly or Full Setup to connect to your on-premises user directory.

I chose the Quick Start and entered a few parameters to get started:

Minutes later my site was all set up and I was ready to go, with notification via a convenient email:

I set up a password and became the official administrator of my very own Zocalo site!

I logged in and explored the Dashboard:

The Dashboard allows me to set the amount of storage per Zocalo user. By default, new users get 200 GB of storage for free. The administrator can choose to allow additional storage, which is billed on a per-GB, per-month basis.

I can control the level of document sharing for the site — unlimited external sharing, sharing to a short list of domains, or no external sharing:

Here's how I enter a list of domains:

I can also manage the invitation model. Users can be allowed to invite others within any domain or in a short list of domains, or this entire feature can be restricted to users with administrator privileges:

I can invite people to become new Zocalo users:

Once my Zocalo site has some users, I can monitor and control their storage utilization, and see an audit log of document activity.

Pricing and Availability
You can join the Zocalo Limited Preview to experience Zocalo on your own.

Zocalo was designed to work smoothly with Amazon WorkSpaces. Each WorkSpaces user has access to 50 GB of Zocalo storage, the Zocalo web application, the tablet apps, and document review at no additional charge. The Zocalo administrator can upgrade these users to 200 GB of storage for just $2 per user per month.

If you don't use Amazon WorkSpaces, Zocalo is priced at $5 per user per month, including 200 GB of storage for each user. Additional storage is billed on a per-GB, per-month basis using a tiered pricing model. See the Zocalo Pricing page for more info.

Zocalo is currently available in the US East (Northern Virginia), US West (Oregon), and Europe (Ireland) Regions. All documents for a particular Zocalo site are stored in encrypted form within the chosen Region.

-- Jeff;

Amazon Web ServicesNew AWS Mobile Services

The Mobile App Development Challenge
We want to make it easier for you to build sophisticated cloud-powered applications for mobile devices! User expectations are at an all-time high: they want to run your app on the device of their choice, they want it to be fast and efficient, and they want it to be secure. Here are some of the challenges that you will face as you strive to meet these expectations:

  • Authenticate Users - Manage users and identity providers.
  • Authorize Access - Securely access cloud resources.
  • Synchronize Data - Sync user preferences across devices.
  • Analyze User Behavior - Track active users and engagement.
  • Manage Media - Store and share user-generated photos and other media items.
  • Deliver Media - Automatically detect mobile devices and deliver content quickly on a global basis.
  • Send Push Notifications - Keep users active by sending messages reliably.
  • Store Shared Data - Store and query NoSQL data across users and devices.
  • Stream Real-Time Data - Collect real-time clickstream logs and react quickly.

Meeting the Challenge
Today we are introducing three new AWS products and services to help you to meet these challenges.

Amazon Cognito simplifies the task of authenticating users and storing, managing, and syncing their data across multiple devices, platforms, and applications. It works online or offline, and allows you to securely save user-specific data such as application preferences and game state. Cognito works with multiple existing identity providers and also supports unauthenticated guest users.

Amazon Mobile Analytics will help you to collect, visualize, and understand app usage, engagement, and revenue at scale. Analytics can be collected via the AWS Mobile SDK or a set of REST APIs. Metrics are available through a series of reporting tabs in the AWS Management Console.

The updated and enhanced AWS Mobile SDK is designed to help you build high quality mobile apps quickly and easily. It provides access to services specifically designed for building mobile apps, mobile-optimized connectors to popular AWS data streaming, storage and database services, and access to a full array of other AWS services. This SDK also includes a common authentication mechanism that works across all of the AWS services, client-side data caching, and intelligent conflict resolution. The SDK can be used to build apps for devices that run iOS, Android, and Fire OS.

Taken as a whole, these services will help you to build, ship, run, monitor, optimize, and scale your next-generation mobile applications for use on iOS, Android, and Fire OS devices. The services are part of the full lineup of AWS compute, storage, database, networking, and analytics services, which are available to you and your users from AWS Regions located in the United States, South America, Europe, and Asia Pacific.

Here is how the new and existing AWS services map to the challenges that I called out earlier:

Let's take a closer look at each of the new services!

Amazon Cognito
Amazon Cognito helps you identify unique users and retrieve temporary, limited-privilege AWS credentials; it also offers data synchronization services.

As you might know, an Identity Provider is an online service that is responsible for issuing identification information for users that would like to interact with the service or with other cooperating services. Cognito is designed to interact with three major identity providers (Amazon, Facebook, and Google). You can take advantage of the identification and authorization features provided by these services instead of having to build and maintain your own. You no longer have to worry about recognizing users or storing and securing passwords when you use Cognito.

Cognito also supports guest user access. In conjunction with AWS Identity and Access Management and with the aid of the AWS Security Token Service, mobile users can securely access AWS resources and app features, and even save data to the cloud, without having to create an account or log in. If the user ultimately decides to log in, Cognito will merge the guest data and identification information into the new identity. Here's how it all fits together:

Here's what you need to do to get started with Cognito:

  1. Sign up for an AWS Account.
  2. Register your app on the identity provider's console and get the app ID or token. This is an optional step; you can also choose to use only unauthenticated identities.
  3. Create a Cognito identity pool in the Management Console.
  4. Integrate the AWS Mobile SDK; store and sync data in a dataset.

You can create and set up the identity pool in the Console:

Once your application is published and in production, you can return to the Console and view the metrics related to the pool:

Let's talk about Cognito's data synchronization facility! The client SDK manages a local SQLite store so that the application can work even when it is not connected. The store functions as a cache and is the target of all read and write operations. Cognito's sync facility compares the local version of the data to the cloud version, and pushes up or pulls down deltas as needed. By default, Cognito assumes that the last write wins. You can override this and implement your own conflict resolution algorithm if you'd like. There is a small charge for each sync operation.
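
To make the "last write wins" behavior concrete, here is a hypothetical, language-neutral sketch in Python of how such a merge could work. This is not the actual Cognito implementation; the `merge_last_write_wins` function and the `(value, timestamp)` representation are assumptions made purely for illustration.

```python
# Hypothetical sketch of a "last write wins" merge between a local dataset
# copy and its cloud copy. Each key maps to a (value, timestamp) pair; for
# every key, the most recently written value survives, and keys that exist
# on only one side are carried over unchanged.

def merge_last_write_wins(local, remote):
    """Merge two {key: (value, timestamp)} dataset copies."""
    merged = dict(remote)
    for key, (value, ts) in local.items():
        if key not in merged or ts > merged[key][1]:
            merged[key] = (value, ts)
    return merged

local = {"theme": ("dark", 105), "level": ("7", 90)}
cloud = {"theme": ("light", 100), "sound": ("on", 110)}

# "theme" comes from local (written at 105, after the cloud's 100);
# "level" and "sound" each exist on only one side, so both are kept.
print(merge_last_write_wins(local, cloud))
```

A custom conflict resolution algorithm, as mentioned above, would replace the timestamp comparison with application-specific logic (for example, keeping the higher game score rather than the newer one).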

Each identity within a particular identity pool can store multiple datasets, each consisting of multiple key/value pairs:

Each dataset can grow to 1 MB and each identity can grow up to 20 MB.

You can create or open a dataset and add key/value pairs with a couple of lines of code:

DataSet *dataset = [syncClient openOrCreateDataSet:@"myDataSet"];
NSString *value = [dataset readStringForKey:@"myKey"];
[dataset putString:@"my value" forKey:@"myKey"];

Charges for Cognito are based on the total amount of application data stored in the cloud and the number of sync operations performed on the data. The Amazon Cognito Free Tier provides you with 10 GB of sync store and 1,000,000 sync operations per month for the first 12 months of usage. Beyond that, sync store costs $0.15 per GB of storage per month and $0.15 for each 10,000 sync operations.

Take a look at the Cognito documentation (Android and iOS) to learn more about this and other features.

Mobile Analytics
Once you have built your app, you need to track usage and user engagement, improving and fine-tuning the app and the user interaction in response to user feedback. The Amazon Mobile Analytics service will give you the information and the insights that you need in order to do this.

Using the raw data ("events") collected and uploaded by your application, Amazon Mobile Analytics automatically calculates and updates the following metrics:

  • Daily Active Users (DAU), Monthly Active Users (MAU), and New Users
  • Sticky Factor (DAU divided by MAU)
  • Session Count and Average Sessions per Daily Active User
  • Average Revenue per Daily Active User (ARPDAU)
  • Average Revenue per Paying Daily Active User (ARPPDAU)
  • Day 1, 3, and 7 Retention
  • Week 1, 2, and 3 Retention
  • Custom Events
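
To make the derived metrics concrete, here is a small Python sketch of the arithmetic behind them. The input numbers are entirely made up for illustration; only the formulas (sticky factor = DAU / MAU, ARPDAU = daily revenue / DAU, ARPPDAU = daily revenue / paying DAU) reflect the definitions above.

```python
# Illustrative arithmetic for the metrics listed above, using hypothetical
# numbers. Sticky factor measures how much of the monthly audience shows
# up on a given day; the revenue metrics normalize daily revenue by the
# relevant user population.

dau = 12_000           # Daily Active Users
mau = 80_000           # Monthly Active Users
daily_revenue = 1_800.00  # revenue attributed to the day, in USD
paying_dau = 600       # daily active users who made a purchase

sticky_factor = dau / mau           # fraction of monthly users active today
arpdau = daily_revenue / dau        # Average Revenue per Daily Active User
arppdau = daily_revenue / paying_dau  # ... per Paying Daily Active User

print(f"sticky factor: {sticky_factor:.2f}")
print(f"ARPDAU: ${arpdau:.3f}")
print(f"ARPPDAU: ${arppdau:.2f}")
```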

To allow your application to upload events, you create an identity pool and use the AWS Mobile SDK (or the REST API) to call the appropriate reporting functions. There are three types of events:

  • System - The start or end of a session.
  • In-App Purchase - A transaction.
  • Custom - Specific actions within your application.

When you use the AWS Mobile SDK, the system events denoting the start and end of each session are sent automatically. Your application code is responsible for sending the other types of events at the appropriate time.

All of the metrics are available from within the AWS Management Console, broken down by tab:

The main page includes plenty of top-level information about your app and your users:

You can click on a tab to learn even more:

You can also filter by application, date range, and/or platform, as needed:

Pricing is based on the number of events that your app generates each month. The first 100 million events are free; beyond that, you will be charged $1.00 for each million events.
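
As a back-of-the-envelope check on that pricing, here is a small sketch. The `monthly_analytics_cost` helper is hypothetical, and for simplicity it bills partial millions pro rata; the actual service's rounding rules may differ.

```python
# Sketch of the pricing described above: the first 100 million events per
# month are free, and additional events cost $1.00 per million (billed
# pro rata here for simplicity).

def monthly_analytics_cost(events, free_events=100_000_000,
                           rate_per_million=1.00):
    """Estimated monthly charge in USD for a given event count."""
    billable = max(0, events - free_events)
    return billable / 1_000_000 * rate_per_million

print(monthly_analytics_cost(80_000_000))   # within the free tier
print(monthly_analytics_cost(250_000_000))  # 150 million billable events
```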

AWS Mobile SDK
Last, but definitely not least, I would like to say a few words about the updated and expanded AWS Mobile SDK! This SDK makes it easy for you to build applications for devices that run the iOS, Android, or Fire OS operating systems.

Here are some of the new features:

Object Mapper - A new DynamoDB Object Mapper for iOS makes it easy for you to access DynamoDB from your mobile apps. It enables you to map your client-side classes to Amazon DynamoDB tables without having to write the code to transform objects into tables and vice versa. The individual object instances map to items in a table, and they enable you to perform various create, read, update, and delete (CRUD) operations on items, and to execute queries.

S3 Transfer Manager - The Amazon S3 Transfer Manager makes it easy for you to upload files to and download files from Amazon S3 while optimizing for performance and reliability. Now you can pause, resume, and cancel file transfers using a simple API. We have rebuilt the iOS S3TransferManager to utilize BFTask. It has a clean interface, and all of the operations are now asynchronous.

Android and Fire OS Enhancements - In addition to support for the services announced in this post, the SDK now supports Amazon Kinesis Recorder for reliable recording of data streams on mobile devices, along with support for the most recent SQS, SNS, and DynamoDB features. It also allows active requests to be terminated by interrupting the proper thread.

iOS / Objective-C Enhancements - The SDK supports ARC and BFTask, and conforms to the best practices for the use of Objective-C. It also supports Cocoapods, and can be accessed from Apple's new Swift language.

-- Jeff;

PS - Special thanks to my friend and colleague Jinesh Varia for creating the screen shots and diagrams in this post!

Norman Walsh (Sun)The short-form week of 30 Jun–6 Jul 2014

<article class="essay" id="content" lang="en">

The week in review, 140 characters at a time. This week, 18 messages in 24 conversations. (With 9 favorites.)

This document was created automatically from my archive of my Twitter stream. Due to limitations in the Twitter API and occasional glitches in my archiving system, it may not be 100% complete.

In a conversation that started on Monday at 03:46pm

Yes,,,, and a few other sites are down. Should be back tomorrow.—@ndw
@ndw is down (used in your flickr1.xsl library). Is this related to the other sites?—@erikespana
@ndw I noticed it today. Do you have an estimate of when that file will be back online?—@erikespana
@ndw What about hosting the file on your github account, until your websites are restored?—@erikespana
@erikespana I'll see what I can do today. Not sure what the story is...—@ndw
@erikespana I didn't think that was down. Other sites still are :-(—@ndw

Monday at 10:10am

So, for those of you keeping score at home, in order of people-ness it's 1. Dudes, 2. Corporations, 3. Fetuses, 4. Women.—@AngerMonkey

Monday at 10:20am

If only there were some connecting thread between the Blackwater, Hobby Lobby, and Facebook stories. And Citizens United.—@kissane

Monday at 05:10pm

I hope we get a bunch of Quaker-owned corporations demanding a religious exemption from taxes that pay for war.—@jacremes

In a conversation that started on Tuesday at 08:01am

@kendall So you could participate more viscerally in boycotting them?—@ndw
@ndw yes—@kendall

In a conversation that started on Tuesday at 09:59am

You make a router. It has an admin page. You make the admin page require JavaScript. You are a moron.—@ndw
@ndw why?—@phidip
@phidip Because it means, for example, that I can't administer the router from a shell window with something like links.—@ndw
@ndw wouldn’t lynx and/or links work? Never had a reason to check js on them, just speculating.—@phidip
@phidip No, because JavaScript.—@ndw

Thursday at 10:49am

Modern society is (largely) about using science & tech to avoid otherwise natural consequences. GOP explicitly rejects modernity for women.—@kendall

Thursday at 11:20am

RT @tomcoates: Unless you contribute, will miss its $5 million goal by ONE DAY! That would suck! Get on it, interne…—@ndw

Thursday at 11:25am

RT @mdz: You can donate to @MayOneUS, and help end the corrupting influence of money on politics, via credit card, PayPal or bitcoin.—@ndw

Thursday at 05:39pm

If your *home page* requests that I “Connect with {Google,Facebook,Email}” before I see anything about your app, you can rot in Hell.—@pgor

Friday at 08:54am

@jzawodn Never!—@ndw

Friday at 07:52pm

RT @dcm: Its this: no matter what the most important issue is to you... its overshadowed by money in politics. This is why…—@ndw

Friday at 07:55pm

RT @MayOneUS: Never tell me the odds. #PledgeIndependence —@ndw

Friday at 10:16pm

RT @bsletten: Boom! #MaydayPAC —@ndw

Friday at 10:18pm

RT @neiltyson: Enjoying colorful fireworks tonight? Thank Aluminum Barium Calcium Chlorine Copper Iron Nitrogen Oxygen Sodium & Strontium.—@ndw

Saturday at 08:37am

life wisdom: “Don’t be mean; we don’t have to be mean, because, remember, no matter where you go, there you are” Buckaroo Banzai—@amyvdh

Saturday at 09:38am

Definitely time to resubscribe to Linux Journal and setup a TOR relay server.—@ndw

Saturday at 01:16pm

.@MayOneUS @ndw Unless '16 Congress seriously addresses income disparity, money in politics will exist regardless of reform.—@lyda

Sunday at 03:29pm

those who don’t There are two types of people in the world: those who understand asynchronous jokes and—@holman

In a conversation that started on Sunday at 04:49pm

I have visited most but not all these United States. I sincerely hope that I never have cause to visit Mississippi. I might refuse to go.—@ndw
@ndw I grew up next door in Louisiana but there are nice places Mississippi and some very sad places.—@patrickDurusau
@ndw it's incredibly beautiful and the people can be very warm; I hope I never go back.—@kendall

Sunday at 04:51pm

Priority task for the coming week: get my websites back online.—@ndw

In a conversation that started on Sunday at 06:01pm

Note to self: eat more oysters.—@ndw
@ndw Whilst avoiding the bad ones?—@dpawson

Sunday at 06:17pm

There is nothing more patriotic than exposing unjust and unfair government practices through a free press. That is democracy. It works.—@codinghorror

Norman Walsh (Sun)The short-form week of 23–29 Jun 2014

<article class="essay" id="content" lang="en">

The week in review, 140 characters at a time. This week, 7 messages in 12 conversations. (With 7 favorites.)

Monday at 07:50am

One Bumper Sticker I Actually Agree With #atheism —@denyreligion

In a conversation that started on Monday at 03:46pm

Yes,,,, and a few other sites are down. Should be back tomorrow.—@ndw
@ndw is down (used in your flickr1.xsl library). Is this related to the other sites?—@erikespana
@ndw I noticed it today. Do you have an estimate of when that file will be back online?—@erikespana
@ndw What about hosting the file on your github account, until your websites are restored?—@erikespana
@erikespana I'll see what I can do today. Not sure what the story is...—@ndw
@erikespana I didn't think that was down. Other sites still are :-(—@ndw

Tuesday at 02:06am

Brilliant. RT @rachelcdavies RT @pgmrsbeingdicks You know what they say: Don’t fuck with The Culture. —@tastapod

Tuesday at 12:17pm

Chemistry humor —@SciencePorn

Tuesday at 07:47pm

* balance 12 turtles in a pile * balance a wine glass on top * leave for a day * work out which turtle broke the glass #SoftwareEngineering —@yoz

In a conversation that started on Wednesday at 09:16am

Looking for a project for a summer intern? How about an XInclude 1.1 implementation? I'm just sayin'...—@ndw
@ndw I'll send a feature request to @oxygenxml. The idfixup stuff is in, but oXygen can't validate docs that use it.—@cramerdw

Wednesday at 10:45pm

@laurendw @timbray Australian, Singaporean, and Welch, were it's top three guesses for me!—@ndw

Thursday at 04:02pm

I have learned that being a technical leader means you must delegate solving complex problems, but you cannot delegate understanding them.—@hillbrad

Thursday at 05:38pm

Space exploration makes a lot more sense when you think of it as humanity's off-site backup plan.—@codinghorror

In a conversation that started on Thursday at 05:57pm

Toothpicks soaked in Islay single malt then kiln dried. Because you can never be too hipster.—@ndw
@ndw now you need to make them and report back!—@alexmilowski
@ndw You have my attention, sir. :)—@mdubinko
@mdubinko Perhaps I should have been more clear. I saw an ad for a $35 box of toothpicks, I didn't actually buy them!—@ndw
@ndw I thought you made them…—@laurendw
@laurendw @ndw I thought he was sneaking liquor into somewhere it shouldn't be!!!—@dpawson
@ndw Ahh, I thought you were describing a maker-style project you undertook.—@mdubinko


Norman Walsh (Sun)The short-form week of 16–22 Jun 2014

<article class="essay" id="content" lang="en">

The week in review, 140 characters at a time. This week, 8 messages in 15 conversations. (With 5 favorites.)

In a conversation that started on Sunday at 02:31am

Huh, @TripIt just sent me a totally spurious gate change announcement. That's a first. Luckily it was obvious. (UA41 dep from T5 not T3)—@ndw
@ndw Hi Norman, please check your e-mail for feedback from the Tripit Support team.—@TripIt

In a conversation that started on Monday at 05:53am

Camera malfunction. #olympus #ep5 #photography :-(—@ndw
@ndw Oh, and happy birthday Norm. 21 at last (or should that be least?). Have a good one.—@dpawson
@dpawson Yeah. In base 23. But thanks!—@ndw

Monday at 06:47am

@ndw Happy birthday...!—@jpcs

In a conversation that started on Monday at 06:58am

RT @jpcs: @ndw Happy birthday...! Want a cake? —@jirkakosek
@jirkakosek @jpcs That's delightful, Jirka! Thank you so much!—@ndw
@ndw Many happy returns Norm—@adamretter
@jirkakosek @jpcs @ndw Love it Jirka!—@dpawson

Monday at 07:14am

XML Stars, the journal is out! Stories via @ndw —@dominixml

Monday at 01:36pm

@ndw Happy BDay! May your cake be well-formed containing a hierarchy of fun in an epic namespace because everything should be in a namespace—@peteaven

Monday at 02:13pm

RT @shelleypowers: Mother Earth gently explaining our place in this world —@ndw

Tuesday at 04:42am

Advertising as microagression. Discuss.—@j4

In a conversation that started on Tuesday at 06:03am

Up at 4:30. Can't eat until after doctor's appointment at 8:30. #badcombination #jetlag —@ndw
@ndw and then it is off to the local diner!—@alexmilowski

Tuesday at 08:57pm

It seems @doctorow was terrifyingly prophetic: @coreplane —@puellavulnerata

Tuesday at 11:31pm

I like this, this is the kind of conversation about tech we need to be having—@tsmuse

Tuesday at 11:36pm

Content-type: text/plane 🛧🛪🛨🛦🛫🛦 🛫🛨 🛨✈🛦🛧🛦 ✈🛬🛫🛫🛬🛫 🛩🛦✈🛩🛬🛬🛨 🛫 🛬🛨✈ 🛩 🛨🛫 🛩 🛩🛨🛬✈🛦 🛦✈🛪 🛦 ✈🛩✈ ✈ 🛬🛬🛧🛧🛩 🛧🛦🛦 🛨✈🛦 🛪🛨🛨 🛬🛧🛩 🛧✈🛪🛩🛨 🛦 🛩🛪🛫🛩 🛧🛬🛨🛩🛦✈ 🛨 🛩🛫🛨✈🛦 🛬🛦🛪—@FakeUnicode

Thursday at 06:17am

Everything is vague to a degree you do not realize till you have tried to make it precise.—@B_RussellQuotes

Thursday at 09:04am

"Want to start your own business in Belgium?" #spamsubjectoftheday —@ndw

Thursday at 09:56am

The @SavedYouAClick account is really quite clever. I rarely click on link bait headlines anyway, but it's fun to see the summaries.—@ndw

Norman Walsh (Sun)The short-form week of 9–15 Jun 2014

<article class="essay" id="content" lang="en">

The week in review, 140 characters at a time. This week, 14 messages in 25 conversations. (With 11 favorites.)

In a conversation that started on Monday at 04:29am

Powered by Java (Sun-era logo): Italian coop supermarket checkout terminals.—@ndw
@ndw pics or it didn't happen.—@fsanders
@fsanders Yeah, I know. Photographing someone else's grocery bill seemed a bit intrusive for me. Besides, what, you don't believe me?—@ndw
@ndw misunderstood. I was envisioning intel inside type logos on the pos machines. Also, I do but, FOR POSTERITY!—@fsanders

Monday at 05:21am

Homeopaths seem to want official recognition by the government. Government ought to provide such recognit... —@tommorris

Monday at 07:14am

XML Stars, the journal is out! Stories via @ndw —@dominixml

Tuesday at 05:53pm

RT @tomcoates: Finally, if you haven't done so, you should stop donating to either the Republicans or the Democrats and donate to http://t.… —@ndw

Wednesday at 09:52am

I WENT through a TIME LORD DRIVE-THRU. They ASKED for my ORDER. I replied 'JUST a MOMENT'. They said, 'WILL THAT BE ALL?' #TimeHumour —@DalekThay

Thursday at 12:03pm

Current status: —@ndw

Thursday at 05:40pm

Loving this documentation of short-URL bad practice, implicating Twitter, Slate and Google. Madness! —@torgo

Thursday at 10:33pm

Next generation: do me a favor and stop signing up for Wall Street. Go into the sciences, okay? Science is WAY COOLER than greed. Thanks—@kristenschaaled

Friday at 02:10am

Between the enormous chains of redirects and the generally poor performance of, I abandon a lot of clicks unseen.—@ndw

Friday at 02:16am

RT @glentickle: Someone gave my wife a "Chemical Free Weed Killer" recipe. I made some corrections. —@ndw

Saturday at 02:10am

The human mind is amazing. It shows that physics can do philosophy.—@sjzara

Saturday at 10:06am

Collectively, the country of France has a giant middle finger made if bread aimed at the gluten-free movement.—@dcm

Saturday at 10:07am

Time is a spiky rubber ball and I gnaw it until it stops squeaking.—@I_AM_PEPITA

In a conversation that started on Saturday at 10:22am

Bunking down at FCO tonight. Home tomorrow. Work on Monday. Can someone remind me again, what is I do for a living? #lovelyvacationending —@ndw
@ndw I share the feeling. Sent the laptop home today, without reading the email backlog. Great to have people minding the shop for me.—@michaelhkay

Saturday at 06:37pm

The two teams would score way more goals between them if they'd just collaborate. I don't understand this game. #WorldCup2014 —@tomcoates

Saturday at 07:40pm

When Jesus said love your enemies, he meant us to take that seriously. #FUM2014 —@robinmsf

Sunday at 01:30am

BA flies the 787 between AUS and LHR. Sweet!—@ndw

Sunday at 01:37am

In the Alitalia lounge at FCO surrounded by pictures of popes in cockpits...must not channel Tim Minchin, must not channel...—@ndw

Sunday at 02:15am

RT @secretGeek: "Reading a book" is a classic important but non urgent task. When your lifestyle lacks any book time, you know you're in th…—@ndw

In a conversation that started on Sunday at 02:31am

Huh, @TripIt just sent me a totally spurious gate change announcement. That's a first. Luckily it was obvious. (UA41 dep from T5 not T3)—@ndw
@ndw Hi Norman, please check your e-mail for feedback from the Tripit Support team.—@TripIt

In a conversation that started on Sunday at 12:18pm

My twitter nick got caught up in some gun nut thread. Holy cow! The crazy, it burns!—@ndw
@ndw May have to find asylum in the local Chipotle's.—@aljopainter

Sunday at 12:44pm

Bush's very public vow not to criticize Obama is a classic case of...meh, fuck that vile dolt.—@kendall

Sunday at 02:05pm

RT @JoePasqua: “@BBCiPlayer is now completely built on the NoSQL @MarkLogic stack” Also powered the 2012 Olympics site.… —@ndw

Sunday at 05:24pm

When I nod at your cultural references, it's not because I know who you are talking about. It's because I don't care enough to ask.—@michaelianblack

Sunday at 07:24pm

Inbox +3119. #backfromvacation #itwasworthit —@ndw

ProgrammableWeb APIs: Oddcast AV Capture

Oddcast offers a set of APIs that allow developers to add tools to enterprise-level platforms to enhance users' engagement with their media. The Oddcast AV Capture Module allows a user to capture the audio and video of any event that takes place on a webcam or on a screen within a Flash application, and packages the video in an easy-to-share MP4 or FLV format. This ActionScript 3.0 API allows the application to customize the cropping area, dimensions, and output type of the videos.
Date Updated: 2014-07-10

ProgrammableWeb APIs: Oddcast Audio Engine

Oddcast offers a set of APIs that allow developers to add tools to enterprise-level platforms to enhance users' engagement with their media. The Audio Engine API allows developers to integrate audio editing such as mixing, transcoding, stitching, and sequencing into their sites and applications. This JavaScript/ActionScript API allows mixing with a maximum of 120 channels and transcodes to and from all common audio types. It is also designed to work well with the Oddcast Voice Toolkit API to receive and edit speech. Oddcast offers powerful, high-capacity cloud computing to service these requests.
Date Updated: 2014-07-10

ProgrammableWeb APIs: Oddcast Voice Toolkit

Oddcast offers a set of APIs that allow developers to add tools to enterprise-level platforms to enhance users' engagement with their media. Oddcast's Voice Toolkit API allows users to record or create dialogue within an application. The toolkit comes with three components. “Text-to-Speech”: users can input text to create an audio version of their message, with 200 voices and 30 languages available. “Record-by-Phone”: an ActionScript and Flash-based API interface allowing users to call a toll-free number to record an MP3 audio clip up to 30 seconds in length. “Record-by-Mic”: an API and XML interface which allows users to record an MP3 audio clip up to 30 seconds in length via a microphone. In all scenarios, users can preview their audio clips. Oddcast offers hosting services as well as ongoing customer support.
Date Updated: 2014-07-10

ProgrammableWeb APIs: Oddcast Text to Speech

Oddcast offers a suite of APIs that allow developers to add tools to enterprise-level platforms to enhance users' engagement with their media. Oddcast's Text to Speech API allows developers to integrate text-to-speech functionality into any web or mobile application. The API supports 20 languages, includes emotive cues and special audio effects, and offers a library of over 185 voices. It is compatible with dynamic web applications, supporting Flash and JavaScript. It also offers admin reporting to track usage, as well as profanity filtering.
Date Updated: 2014-07-10
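A minimal sketch of what a request to a text-to-speech service like this might look like. The field names, voice numbering, and the profanity-filter flag below are illustrative assumptions, not Oddcast's documented interface.

```python
# Hypothetical sketch of a text-to-speech request body. All field
# names and values here are assumptions for illustration only.
def build_tts_payload(text, voice=1, language="en", filter_profanity=True):
    """Assemble a request payload for a TTS call."""
    return {
        "text": text,
        "voice": voice,            # e.g. one of the ~185 voices mentioned above
        "language": language,      # e.g. one of the ~20 supported languages
        "filter": filter_profanity,
    }

payload = build_tts_payload("Hello, world")
print(payload)
```

In practice the payload would be posted to the vendor's endpoint and the response would carry (or link to) the rendered audio.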

ProgrammableWeb APIs: EmoVu

EmoVu specializes in creating emotionally intelligent tools that perceive people's emotions by analyzing micro-expressions captured via webcam. Supporting standard video, image, and image-sequence formats, this emotion-prediction analysis can help determine a user's opinion of the content they are viewing in real time. This allows developers to tailor content to highly specific demographics, using data on age, gender, mood, and more to adjust content either before or after it debuts, ideally helping to boost ROI for lean campaigns. The API uses GPU processing for increased performance, returning nearly 20 unique metrics per user.
Date Updated: 2014-07-10

ProgrammableWeb APIs: Lambda Labs Face Recognition

The Face Recognition API by Lambda Labs allows developers to send a picture link to the service to detect age, gender, and the locations of facial features, including the eyes, nose, and mouth. Developers can create an album of photos, and the API will analyze and compare new photos against that library, displaying results and percentage similarity with existing photos. Signing up allows developers to call this free RESTful API programmatically.
Date Updated: 2014-07-10
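Since the service takes a picture link, a call amounts to composing a URL with the image location and a key. The sketch below only builds such a request; the host, path, and parameter names are invented placeholders, not Lambda Labs' documented endpoint.

```python
# Hypothetical sketch of composing a "detect from image URL" request
# for a face-recognition style REST API. The endpoint and parameter
# names are illustrative assumptions only.
import urllib.parse

def build_detect_request(base_url, image_url, api_key):
    """Compose the GET request URL for a detection call."""
    query = urllib.parse.urlencode({"urls": image_url, "api_key": api_key})
    return f"{base_url}/detect?{query}"

req = build_detect_request(
    "https://api.example.com/face",      # placeholder host, not the real one
    "https://example.com/photo.jpg",
    "YOUR_API_KEY",
)
print(req)
```

The response would typically be a JSON document listing detected faces with their attributes and feature coordinates.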

ProgrammableWeb APIs: Biblio Commons

Biblio Commons is a social virtual library that helps users integrate Biblio Commons data into websites and mobile applications. The core of Biblio Commons is to promote discovery, searching, and borrowing in a more integrated manner. The API is particularly useful for developers working in libraries who are interested in turning traditional library functionality into a set of interactive experiences that users may find more convenient, integrating library collections into a social, interactive setting. Features include record synchronization, easy configuration, and uniform integration. The Biblio Commons API is RESTful: developers can send HTTPS requests and receive JSON objects. Documentation covers URL structure, authentication, and the available resources.
Date Updated: 2014-07-10
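Since the API returns JSON over HTTPS, the client-side work is mostly parsing. A minimal sketch of consuming such a response follows; the response shape and field names are invented for illustration and will differ from BiblioCommons' actual schema.

```python
# Hypothetical sketch of parsing a JSON catalog response like the one
# a RESTful library API might return. The field names ("titles",
# "available", etc.) are assumptions, not the real schema.
import json

sample = '''{
  "titles": [
    {"id": "b1", "name": "DocBook: The Definitive Guide", "available": true},
    {"id": "b2", "name": "HTML5 for Publishers", "available": false}
  ]
}'''

def available_titles(payload):
    """Return the names of titles currently available to borrow."""
    return [t["name"] for t in json.loads(payload)["titles"] if t["available"]]

print(available_titles(sample))
```

With real credentials, `sample` would instead be the body of an authenticated HTTPS GET against the catalog endpoint.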

ProgrammableWeb APIs: Coca Cola Enterprises

Coca Cola Enterprises intends to connect with the community through the virtual sustainability map featured on its website. At the same time, CCE offers its API so developers can build a collaborative environment online and offline. The API is valuable because companies might benefit from the connections between a product, public responsibility, and social commitment. It supports JSON and XML formats, and documentation covers product, location, customer, and supplier navigation. Access to the information varies depending on whether the developer is public or a partner. The CCE API site encourages developers to explore the documentation using API key and location methods. A discussion forum is available for developer support.
Date Updated: 2014-07-10

Amazon Web Services: New Location for CloudFront and Route 53 - Melbourne, Australia

I am happy to announce the launch of an edge location in Melbourne, Australia for Amazon CloudFront and Amazon Route 53.

Global Coverage
This new location will improve performance and availability for end users of applications being served by CloudFront and Route 53 and bring the total number of AWS edge locations to 52 worldwide. Here's the breakdown:

  • United States (20)
  • Europe (16)
  • Asia (12)
  • Australia (2)
  • South America (2)

CloudFront and Route 53 customers don't need to make any changes to their applications to take advantage of this new edge location - requests from end users in these locations will automatically be routed for the best possible performance.

Full Functionality
This new edge location supports all Amazon CloudFront functionality, including accelerating your entire website (static, dynamic and interactive content), live and on-demand streaming media, and security features like custom SSL certificates, private content, and geo-restriction of content. It also supports all Amazon Route 53 functionality including health checks, DNS failover, and latency-based routing.

Learn More at our Webinars
If you are interested in using CloudFront and would like to learn more about the newest features, please consider joining us for the following online events:

-- Jeff;


Updated: .  Michael(tm) Smith <>