On The Importance of Backups and Personal Information Security :)



What Happened When One Man Pinged the Whole Internet

A home science experiment that probed billions of Internet devices reveals that thousands of industrial and business systems offer remote access to anyone.

You probably haven’t heard of HD Moore, but until a few weeks ago every Internet device in the world, perhaps including some in your own home, was contacted roughly three times a day by a stack of computers that sit overheating in his spare room. “I have a lot of cooling equipment to make sure my house doesn’t catch on fire,” says Moore, who leads research at computer security company Rapid7. In February last year he decided to carry out a personal census of every device on the Internet as a hobby. “This is not my day job; it’s what I do for fun,” he says.
Moore has now put that fun on hold. “[It] drew quite a lot of complaints, hate mail, and calls from law enforcement,” he says. But the data collected has revealed some serious security problems, and exposed some vulnerable business and industrial systems of a kind used to control everything from traffic lights to power infrastructure.
Moore’s census involved regularly sending simple, automated messages to each one of the 3.7 billion IP addresses assigned to devices connected to the Internet around the world (Google, in contrast, collects information offered publicly by websites). Many of the two terabytes (2,000 gigabytes) worth of replies Moore received from 310 million IPs indicated that they came from devices vulnerable to well-known flaws, or configured in a way that could let anyone take control of them.
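To get a feel for the mechanics and scale of such a census, here is an illustrative sketch (not Moore's actual tooling): a probe is nothing more exotic than a brief connection attempt, and the arithmetic shows the probe rate a full sweep implies.

```python
import socket

def probe(ip, port=80, timeout=0.5):
    """One 'simple, automated message': try to open a TCP connection
    to an address and report whether anything answered. Illustrative
    only -- a real census would also use UDP probes and banner grabs."""
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:
        return False

# Back-of-the-envelope scale: ~3.7 billion addresses, each contacted
# roughly three times a day, works out to ~128,000 probes per second.
ADDRESSES = 3_700_000_000
probes_per_second = ADDRESSES * 3 / 86_400  # 86,400 seconds in a day
print(f"{probes_per_second:,.0f} probes per second")
```

The probe rate, not the per-probe cost, is what forces a rack of overheating machines rather than a single laptop.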
On Tuesday, Moore published results on a particularly troubling segment of those vulnerable devices: ones that appear to be used for business and industrial systems. More than 114,000 of those control connections were logged as exposed on the Internet with known security flaws. Many could be accessed using default passwords, and 13,000 offered direct access through a command prompt without a password at all.
Those vulnerable accounts offer attackers significant opportunities, says Moore, including rebooting company servers and IT systems, accessing medical device logs and customer data, and even gaining access to industrial control systems at factories or power infrastructure. Moore’s latest findings were aided by a similar dataset published by an anonymous hacker last month, gathered by compromising 420,000 pieces of network hardware.
The connections Moore was looking for are known as serial servers, used to connect devices to the Internet that don’t have that functionality built in. “Serial servers act as glue between archaic systems and the networked world,” says Moore. “[They] are exposing many organizations to attack.” Moore doesn’t know whether the flaws he has discovered are being exploited yet, but has released details on how companies can scan their systems for the problems he uncovered.
Joel Young, chief technology officer of Digi International, manufacturer of many of the unsecured serial servers that Moore found, welcomed the research, saying it had helped his company understand how people were using its products. “Some customers that buy and deploy our products didn’t follow good security policy or practices,” says Young. “We have to do more proactive education for customers about security.”
Young says his company sells a cloud service that can give its products a private, secured connection away from the public Internet. However, he also said that Digi would continue to ship products with default passwords, because it made initial setup smoother, and that makes customers more likely to set their own passwords. “I haven’t found a better way,” he says.
Billy Rios, a security researcher who works on industrial control systems at security startup company Cylance, says Moore’s project provides valuable numbers to quantify the scale of a problem that is well-known to experts like himself but underappreciated by companies at risk.
Rios says that in his experience, systems used by more “critical” facilities such as energy infrastructure are just as likely to be vulnerable to attack as those used for jobs such as controlling doors in a small office. “They are using the same systems,” he says.
Removing serial servers from the public Internet so that they are accessed through a private connection could prevent many of the easiest attacks, says Rios, but attackers could still use various techniques to steal the necessary credentials.
The new work adds to other significant findings from Moore’s unusual hobby. Results he published in January showed that around 50 million printers, games consoles, routers, and networked storage drives are connected to the Internet and easily compromised due to known flaws in a protocol called Universal Plug and Play (UPnP). This protocol allows computers to automatically find printers, but is also built into some security devices, broadband routers, and data storage systems, and could be putting valuable data at risk.
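The "automatically find printers" step of UPnP is its discovery protocol, SSDP: a device multicasts a short plain-text request, and every listening device answers with its address. The sketch below builds (but does not send) the standard M-SEARCH request, using SSDP's well-known multicast address and port; it illustrates why exposure is so easy, since the whole exchange is unauthenticated text.

```python
# SSDP -- the discovery half of UPnP -- is HTTP-style text over UDP.
# Sending this request to 239.255.255.250:1900 on a network would make
# every UPnP device in earshot announce itself. Built here, not sent.
SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900

def build_msearch(search_target="ssdp:all", mx=2):
    """Compose an SSDP M-SEARCH discovery request as raw bytes."""
    lines = [
        "M-SEARCH * HTTP/1.1",
        f"HOST: {SSDP_ADDR}:{SSDP_PORT}",
        'MAN: "ssdp:discover"',
        f"MX: {mx}",             # max seconds a device may wait before replying
        f"ST: {search_target}",  # search target: everything, or one device type
        "", "",                  # blank line terminates the request
    ]
    return "\r\n".join(lines).encode("ascii")

request = build_msearch()
print(request.decode())
```

Because any sender on the network (or, for misconfigured routers, on the Internet) gets the same answer, a device that speaks SSDP publicly is trivially enumerable.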
Data collected by Moore’s survey has also helped Rapid7 colleagues identify how a piece of software called FinFisher was used by law enforcement and intelligence agencies to spy on political activists. It also helped unmask the control structure for a long-running campaign called Red October that infiltrated many government systems in Europe.
Moore believes the security industry is overlooking some rather serious, and basic, security problems by focusing mostly on the computers used by company employees. “It became obvious to me that we’ve got some much bigger issues,” says Moore. “There [are] some fundamental problems with how we use the Internet today.” He wants to get more people working to patch up the backdoors that are putting companies at risk.
However, Moore has no plans to probe the entire Internet again. Large power and Internet bills, and incidents such as the Chinese government’s Computer Emergency Response Team asking U.S. authorities to stop Moore “hacking all their things,” have convinced him it’s time to find a new hobby. Still, with plenty of data left to analyze, there will likely be more to reveal about the true state of online security, says Moore: “We’re sitting on mountains of new vulnerabilities.”

By Tom Simonite

A First Step Toward Specialization in Informatics Matters in the Courts!

TUESDAY, 30 April 2013
Official Gazette (Resmî Gazete)
Issue: 28633
Law No. 6460
Date of Adoption: 17/4/2013

ARTICLE 5 – The following provisional article has been added to Law No. 2576.
“PROVISIONAL ARTICLE 19 – Within six months of the date on which the amendment made by the Law introducing this article to paragraph (4) of Article 2 of this Law enters into force, the High Council of Judges and Prosecutors shall issue a decision on the division of work among the courts, and that decision shall be published in the Official Gazette. The decision shall also specify the date on which the courts are to begin operating under the division-of-work principles. Cases filed before that date shall continue to be heard in the courts where they are pending.”

PayPal streamlines the mobile shopping experience with its new Log In With PayPal identity solution

PayPal is rolling out a new identity solution designed to help streamline the mobile shopping experience. Unveiled at the Future Insights conference in Las Vegas, Log In With PayPal lets developers and third-party commerce sites help shoppers pay for what they want with as few swipes and as little typed-in information as possible — but still in a secure environment.
Log In with PayPal is not PayPal’s competitor to Facebook Connect or Google+ Sign-In. However, it does leverage the OAuth 2.0 protocol that Facebook uses to authenticate users. The idea is that the company’s 128 million account holders can simply complete their purchase using their username and password, or their mobile number and PIN, as confirmation of their identity.
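In an OAuth 2.0 authorization-code flow of the kind described, the merchant's first step is simply to redirect the shopper to the identity provider with a handful of query parameters. The sketch below shows that step; the endpoint URL, client ID, and scope names are hypothetical placeholders, not PayPal's actual values.

```python
from urllib.parse import urlencode

# Hypothetical endpoint -- a placeholder, not PayPal's real URL.
AUTHORIZE_ENDPOINT = "https://identity.example.com/oauth2/authorize"

def build_authorization_url(client_id, redirect_uri, scope, state):
    """Step 1 of the OAuth 2.0 authorization-code flow: send the user to
    the provider, which authenticates them and asks for their consent."""
    params = {
        "response_type": "code",       # ask for an authorization code
        "client_id": client_id,        # identifies the merchant app
        "redirect_uri": redirect_uri,  # where the provider sends the user back
        "scope": scope,                # which account data the merchant may see
        "state": state,                # anti-CSRF token the merchant verifies later
    }
    return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}"

url = build_authorization_url(
    client_id="merchant-app-123",
    redirect_uri="https://shop.example.com/callback",
    scope="profile email",
    state="xyzzy",
)
print(url)
```

After the shopper consents, the provider redirects back with a short-lived code that the merchant's server exchanges for an access token; the shopper's password never touches the merchant's site, which is the point of the design.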
Damon Hougland, PayPal’s Senior Director of Identity, says that as more people shop using mobile devices, merchants and retailers face the challenge of making it easier to shop and pay. Log In with PayPal makes it possible that when a purchase is made, customers need only enter their login credentials and PayPal will take care of the rest. Typically, a business might ask for more information about the customer for its own purposes, and the more fields someone needs to fill out, the greater the chance of shopping cart abandonment.
Log In with PayPal takes data already associated with a customer’s PayPal account and passes it to the merchant in a secure manner. After entering the proper credentials, the customer is shown the account information in question and asked whether they agree to share it with the business. Once they agree, the transaction is complete and no additional information is needed.
This new login version replaces PayPal Access and is meant to provide a higher level of assurance and trust to both parties by verifying that the identities involved are genuine. Of course, if you update the information in your account, an update will be sent to the services using Log In with PayPal so that they have the latest information.
Developers interested in integrating Log In with PayPal into their services can take advantage of it starting today. Hougland says that it uses open technologies such as OAuth and JSON. The new service builds on PayPal’s release of its RESTful API at South by Southwest last March.

By Ken Yeung

Pentagon Paying China — Yes, China — To Carry Data

The Pentagon is so starved for bandwidth that it’s paying a Chinese satellite firm to help it communicate and share data.

U.S. troops operating on the African continent are now using the recently-launched Apstar-7 satellite to keep in touch and share information. And the $10 million, one-year lease — publicly unveiled late last week during an ordinarily-sleepy Capitol Hill subcommittee hearing — has put American politicians and policy-makers in a bit of a bind. Over the last several years, the U.S. government has publicly and loudly expressed its concern that too much sensitive American data passes through Chinese electronics — and that those electronics could be sieves for Beijing’s intelligence services. But the Pentagon says it has no other choice than to use the Chinese satellite. The need for bandwidth is that great, and no other satellite firm provides the continent-wide coverage that the military requires.

“That bandwidth was available only on a Chinese satellite,” Deputy Assistant Secretary of Defense for Space Policy Doug Loverro told a House Armed Services Committee panel, in remarks first reported by InsideDefense.com. “We recognize that there is concerns across the community on the usage of Chinese satellites to support our warfighter. And yet, we also recognize that our warfighters need support, and sometimes we must go to the only place that we can get it from.”

The Apstar-7 is owned and operated by a subsidiary of the state-controlled China Satellite Communication Company, which counts the son of former Chinese premier Wen Jiabao as its chairman. But the Pentagon insists that any data passed through the Apstar-7 is protected from any potential eavesdropping by Beijing. The satellite uplinks and downlinks are encrypted, and unspecified “additional transmission security” procedures cover the data in transit, according to Lt. Col. Damien Pickart, a Defense Department spokesperson.

“We reviewed all the security concerns, all of the business concerns with such a lease,” Loverro said. “And so from that perspective, I’m very pleased with what we did. And yet, I think the larger issue is we don’t have a clear policy laid out on how do we assess whether or not we want to do this as a department, as opposed to just a response to a need.”

Every new drone feed and every new soldier with a satellite radio creates more appetite for bandwidth — an appetite the military can’t hope to fill with military spacecraft alone. To try to keep up, the Pentagon has leased bandwidth from commercial carriers for more than a decade. And the next decade should bring even more commercial deals; in March, the Army announced it was looking for new satellite firms to help troops in Afghanistan communicate. According to a 2008 Intelligence Science Board study (.pdf) — one of the few public reports on the subject — demand for satellite communications could grow from about 30 gigabits per second today to 80 gigabits per second a decade from now.

The Chinese are poised to help fill that need — especially over Africa, where Beijing has deep business and strategic interests. In 2012, China for the first time launched more rockets into space than the U.S. – including the Chinasat 12 and Apstar-7 communications satellites.

Relying on Chinese companies could be a problematic solution to the bandwidth crunch, however. U.S. officials have in recent years publicly accused Chinese telecommunications firms of being, in effect, subcontractors of Beijing’s spies. Under pressure from the Obama administration and Congress, the Chinese company Huawei was rebuffed in its attempts to purchase network infrastructure manufacturer 3Com; in 2010, Sprint dropped China’s ZTE from a major U.S. telecommunications infrastructure contract after similar prodding. Last September, executives from Huawei and ZTE were brought before the House intelligence committee and told, in effect, to prove that they weren’t passing data back to Beijing. “There’s concern because the Chinese government can use these companies and use their technology to get information,” Rep. Dutch Ruppersberger said at the time. The executives pushed back against the charges, and no definitive links to espionage operations were uncovered. But the suspicion remains. And it isn’t contained to these two firms.

“I’m startled,” says Dean Cheng, a research fellow and veteran China-watcher at the Heritage Foundation. “Is this risky? Well, since the satellite was openly contracted, they [the Chinese] know who is using which transponders. And I suspect they’re making a copy of all of it.”

Even if the data passing over the Apstar-7 is encrypted, the coded traffic could give Chinese cryptanalysts valuable clues about how the American military obfuscates its information. “This is giving it to them in a nice, neat little package,” Cheng says. “I think there is a potential security concern.”

For his part, Loverro says the Department of Defense will be reviewing its procedures to ensure that future satellite communications deals both let troops talk and let them talk in private. The Pentagon will get another opportunity shortly: the Apstar-7 deal is up on May 14, and can be renewed for up to three more years.
By Noah Shachtman

New Documents on the Article 29 Working Party website

New documents have been published on the Article 29 Working Party website under the following titles:
1) 25.04.2013 Letter from the Article 29 Working Party, addressed to Mr Philip Lowe, Director General for Energy, regarding the opinion on the Data Protection Impact Assessment Template for Smart Grid and Smart Metering Systems ("DPIA Template") - Annex
2) 23.04.2013 Letter from the Article 29 Working Party, addressed to IATA, regarding Checkpoint of the Future - Annex
3) 23.04.2013 Letter from the Article 29 Working Party, addressed to IATA, regarding New Distribution Capability
4) 22.04.2013 Reply Letter from Ms Le Bail, Director General of Justice, addressed to the Chairman of the Article 29 Working Party, regarding the publication of LIBE letter
5) 15.04.2013 Reply letter from the Article 29 Working Party, addressed to Mr. Manservisi, Director General of Home Affairs, regarding meeting on the follow up of the second Joint Review of the EU-US TFTP Agreement
6) 08.04.2013 Letter from Mr. Manservisi, Director General of Home Affairs addressed to Article 29 Working Party, regarding EU-US TFTP Agreement
7) 02.04.2013 Letter from the Chairman of the Article 29 Working Party addressed to Ms Le Bail, Director General of Justice, regarding the publication, by the Commission, of several letters by the Article 29 Working Party on the page set aside for the Working Party matters on the website of the Commission (DG Justice)
8) 26.03.2013 Letter from Ms Le Bail, Director General of Justice, addressed to the Chairman of the Article 29 Working Party, regarding the publication, by the Commission, of several letters by the Article 29 Working Party on the page set aside for the Working Party matters on the website of the Commission (DG Justice)
9) 05.03.2013 eMail from the Chairman of the Article 29 Working Party addressed to Ms Le Bail, Director General of Justice, regarding publication of documents at the Working Party's website, hosted by the European Commission


European Data Protection Authorities Adopted an Explanatory Document on Binding Corporate Rules (BCR) for Processors

The European data protection authorities, assembled in the Article 29 Working Party (WP29), adopted an explanatory document on Processor BCR in order to further explain the principles and elements to be found in Processor BCR set out in the Working Document 02/2012 (WP195) adopted on 6 June 2012.

Launched on 1 January 2013, BCR for processors are internal codes of conduct regarding data privacy and security, to ensure that transfers of personal data outside the European Union by a processor, who acts on behalf of his clients and under their instructions, will take place in accordance with the EU rules on data protection.

Therefore, Processor BCR shall be understood as adequate safeguards provided by a processor to a controller, in order to allow the latter to demonstrate to data protection authorities adequate protection and obtain, where required by national laws, the necessary authorisation for transfers of their personal data to the different entities of their processors (for example subprocessors and data centres).

The explanatory document adopted on 19 April 2013 is aimed at providing further guidance to companies on what shall be contained in Processor BCR, further to the table checklist adopted by the Working Party in June 2012 (WP195).
Processor organisations that wish to implement BCR for processors within their group shall apply to their lead DPA using the standard application form adopted on 17 September 2012. The application procedure is the same as the one for BCR for controllers: a process led by a lead DPA with a system of mutual recognition involving a substantial number of European DPAs.


Harvard Professor Re-Identifies Anonymous Volunteers In DNA Study

A Harvard professor has re-identified more than 40% of a sample of anonymous participants in a high-profile DNA study, highlighting the danger that the ever greater amounts of personal data available in the Internet era can unravel personal secrets.
From the outset, the Personal Genome Project, set up by Harvard Medical School Professor of Genetics George Church, has warned participants of the risk that someone someday could identify them, meaning anyone could look up the intimate medical histories that many have posted along with their genome data. That day arrived on Thursday.
Professor Latanya Sweeney, director of the Data Privacy Lab at Harvard, along with her research assistant and two students scraped data on 1,130 people of the now more than 2,500 who have shared their DNA data for the Personal Genome Project. Church’s project posts information about the volunteers on the Internet to help researchers gain new insights about human health and disease. Their names do not appear, but the profiles list medical conditions including abortions, illegal drug use, alcoholism, depression, sexually transmitted diseases, medications and their DNA sequence.
Of the 1,130 volunteers Sweeney and her team reviewed, about 579 provided zip code, date of birth, and gender, the three key pieces of information she needs to identify anonymous people when combined with voter rolls or other public records. Of these, Sweeney succeeded in naming 241, or 42% of the total. The Personal Genome Project confirmed that 97% of the names matched those in its database when nicknames and first-name variations were included. She describes her findings here.
Sweeney has also set up a web page for anyone to test how unique their birthdate, gender and zip are in combination. When I tried it, I was the only match in my zip code, suggesting that I, like so many others, would be easy to re-identify. “This allows us to show the vulnerabilities and to show that they can be identified by name,” she said. “Vulnerabilities exist but there are solutions too.”
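Mechanically, the linkage attack is just a database join on the three quasi-identifiers. Here is a toy sketch of the idea; every record, name, and value below is invented for illustration, not drawn from the study.

```python
# Toy linkage attack: join an "anonymized" study dataset with a public
# voter roll on the (zip, birthdate, gender) quasi-identifier triple.
study = [  # no names, but quasi-identifiers plus a sensitive attribute
    {"zip": "02138", "dob": "1961-07-04", "sex": "F", "condition": "depression"},
    {"zip": "60601", "dob": "1954-01-15", "sex": "M", "condition": "alzheimers risk"},
]
voter_roll = [  # public record: names plus the same quasi-identifiers
    {"name": "Jane Roe", "zip": "02138", "dob": "1961-07-04", "sex": "F"},
    {"name": "John Doe", "zip": "60601", "dob": "1954-01-15", "sex": "M"},
]

def reidentify(study, voter_roll):
    """Link each study record to a voter-roll name via the shared triple."""
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    voters = {key(v): v["name"] for v in voter_roll}
    return [(voters[key(r)], r["condition"])
            for r in study if key(r) in voters]

for name, condition in reidentify(study, voter_roll):
    print(name, "->", condition)
```

This is also why the coarsening Sweeney recommends works: replacing the exact birthdate with a birth year, or a five-digit zip with three digits, makes the join key ambiguous, so one study record matches many voters instead of one.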
(Personal disclosure: I work closely with Professor Sweeney in the Harvard Department of Government on topics related to my book research on the business of personal data, but was not involved with this study).
On Thursday, researchers and participants in the Personal Genome Project gathered in Boston for a conference timed to mark the 60th anniversary of James Watson and Francis Crick’s publication of their discovery of the DNA double helix structure in April 1953. Sweeney and her research assistant set up a table at the conference where participants could find out whether they could easily be identified. Sweeney sought not to out the study participants, but rather to demonstrate to them how providing a little less information–for example, just birth year rather than exact birth date, and three digits rather than five or nine from the zip code–could help preserve anonymity for participants.
Several participants said they expected someone would one day re-identify them and said they were not particularly concerned. Volunteer Gabriel Dean said he was far more worried about another future threat forecast by the experiment, that one day criminals might be able to replicate DNA and place some at the scene of a crime. The conference took place a few blocks from the scene of the Boston Marathon bombing earlier this month.
Another “outed” participant, James Smith, a 59-year-old who lives outside Chicago, says he has an additional layer of protection because his name is so common. His genetic testing showed he had a greater possibility than a typical person of developing Alzheimer’s disease, but, he said, “I’m not worried about job discrimination, I’m not worried about health care.” Smith is independently wealthy after having sold his company to Yahoo. “I’m retired.”
Volunteer Lenore Snyder, however, said that she did not want to be identified and as a result did not provide her zip code and some other identifying characteristics in her profile. She said her genetic testing suggests she has an intellectual disability, even though she is a molecular biologist with a PhD. “People don’t know how to interpret this,” she said. “It’s dangerous. A little bit of information is dangerous.”
Sweeney’s latest findings build on a 1997 study she did that showed she could identify up to 87% of the U.S. population with just zip code, birthdate and gender. She was also able to identify then Massachusetts Gov. William Weld from anonymous hospital discharge records.
The same techniques could be used to identify people in various surveys and records, pharmacy purchases, or from a wide variety of seemingly anonymous activities such as Internet searches. Figuring out clues about people could also enable identity theft. “I believe that many people in the current interconnected digital world are not aware of how easy it is to identify them with a high level of granularity,” says Keith Batchelder, the founder of Genomic Healthcare Strategies in Charlestown, Massachusetts, and one of the first ten volunteers in the Personal Genome Project.
Church, who maintains a thick mountain-man beard, says that advances in data and in medicine make it impossible to guarantee anonymity for most medical experiment volunteers. Church has participated as a volunteer himself in past medical studies and scoffs at claims that such data can remain anonymous. Every year his university sends him an anonymous survey. He scribbles in some additional information at the beginning of the form. “My name is George Church, you could figure that out anyway,” he writes.
His Personal Genome Project makes no privacy promises at all. “The Personal Genome Project is a new form of public genomics research and, as a result, it is impossible to accurately predict all of the possible risks and discomforts that you might experience,” the 24-page consent form tells users. Later it specifies some possible risks: “The data that you provide to the PGP may be used, on its own or in combination with your previously shared data, to identify you as a participant in otherwise private and/or confidential research.”
Volunteers take an online exam about the risks they face before they are allowed into the program. The test does not pose a single generic “you do understand the risks” question: it has 20 questions, and Church requires a perfect score. Potential volunteers can take the test as many times as they want until they pass. One person took the test 90 times before passing.
Given what Church sees as the flaws in preserving privacy in the Internet age, he has embraced openness about many aspects of his own history. On his personal home page he posts the exact coordinates of his home, his birthdate and parents, medical problems (heart attack, carcinoma, narcolepsy, dyslexia, pneumonia, motion sickness) and even a copy of the 1976 letter booting him out of Duke University for getting an F in his graduate major subject.
Many of the early participants in the Personal Genome Project share the same ‘let it all hang out’ ethos. Volunteer Steven Pinker, a well-known experimental psychologist and author of the 2011 book “The Better Angels of Our Nature,” posts his genome and a 1996 scan of his brain on his web page. He says even data as in-depth as his genome and medical records does not provide especially deep insights into a person.
“There just isn’t going to be an ‘honesty gene’ or anything else that would be nearly as informative as a person’s behavior, which, after all, reflects the effect of all three billion base pairs and their interactions together with chance, environmental effects, and personal history,” he says. “As for the medical records, I just don’t think anyone is particularly interested in my back pain.”
Could companies use medical information to single out people to deny them services? Might a bank, for example, turn down a loan to someone because their health records suggest they may die at a young age? Even though Church expected re-identification of his volunteers, he does not think so. “These companies are not yet highly motivated to do that, and probably, judging from the way the wind’s blowing on the Genetic Information Nondiscrimination Act, they would be ill advised to do that from a public relations standpoint,” he says, referring to the 2008 law.
In a different study released earlier this year, researcher Yaniv Erlich at the Whitehead Institute for Biomedical Research in Cambridge, Massachusetts, was also able to re-identify almost 50 people participating in a different genomic study. He said that he does not know of anyone who has suffered harm to date from such re-identifications, but pointed out that the current ethical debate “emerged from the very bad history of the field in the first half of the 20th century, where bad genetics and an abundance of records of familial genealogy contributed to one of the most horrific crimes.”
Misha Angrist, an assistant professor of the practice at the Duke Institute for Genome Sciences & Policy and one of the original ten to participate in the Personal Genome Project, praises the re-identification experiments by researchers such as Sweeney and Erlich. “It is a nuisance to scientists who are trying to operate under the status quo and to tell their participants with a straight face, you know, it’s very unlikely that you will be identified,” he says. “It is useful for pointing out that the emperor has no clothes, that absolute privacy and confidentiality are illusory.”

By Adam Tanner


Gaziantep Chamber of Industry (GSO) Assembly Chairman Abdulkadir Konukoğlu, who will not stand for the assembly chairmanship in the new term, bade farewell to the chamber’s members, saying: “Everyone envies Gaziantep. I thank you all. You brought Gaziantep’s industry to where it is today.”
The last Assembly Meeting was held at GSO ahead of the elections to take place on Saturday, 4 May.
Thanking the members for the support they gave him during his last 15 years as chairman, Konukoğlu noted that he had attended a meeting in Adana the previous day, and said:
“Everyone envies Gaziantep. The reason, my friends, is you. A city cannot develop if discordant voices come from everyone. For that I thank you all. You brought Gaziantep’s industry to where it is today. Each of you has a share in the development of our Chamber of Industry. We carried these last 15 years through without quarrels, without hurting one another, and as an example to Turkey.”
Stressing that GSO acts in unity on every issue and is unlike the chambers of other cities, Abdulkadir Konukoğlu pointed out that Prime Minister Recep Tayyip Erdoğan and his ministers hold Gaziantep up as an example wherever they go.
Saying he is certain Gaziantep will go even further, Konukoğlu wished success to the members about to be elected to the Assembly.
Likening himself and the deputy chairmen leaving office alongside him, Atilla Güner and Mehmet Haratoğlu, to “the three musketeers,” Konukoğlu said: “The three of us hand this duty over to you wholeheartedly. We have reached a certain point; tomorrow you will do the same. We must always keep the road ahead of you open so that this industry can climb higher. When we act together, it is always Gaziantep that wins.”
Offering the board’s thanks to the Assembly Chairman and the presiding committee, GSO Board Chairman Adil Konukoğlu pledged: “With the trust you have placed in us, we will work with all our strength, together with all our colleagues, to take this institution further; we will carry the flag we have taken over even further.”
Assessing the Gaziantep economy, Konukoğlu reported that work on the Şahinbey-Polateli Organized Industrial Zone is continuing, and that the commission set up by the Governor’s Office has approved the Airport Organized Industrial Zone (Havaalanı OSB), which has now reached the stage of application to the ministry.
“God willing, it will become an OIZ within the year. Work to expand the area there has already begun,” Konukoğlu told the Assembly members, updating them on developments.

Recalling that exports have slowed across Turkey recently, Adil Konukoğlu noted that while many cities’ exports declined, Gaziantep, which has consistently grown by more than 10 percent, still closed March with exports up 4.3 percent on the same month a year earlier.
Pointing out that Gaziantep’s January-March export growth was 10.5 percent, Konukoğlu emphasized that the city retained its position as Turkey’s sixth-largest exporting province.
Noting that business people have a part to play in increasing exports of industrial products, Konukoğlu said that in the January-March 2013 period Gaziantep’s exports rose by 3.2 percent to Middle Eastern countries, 7.4 percent to EU countries, 20.4 percent to African countries, 3 percent to the Americas, 55.8 percent to Asia and Oceania, 19 percent to former Eastern Bloc countries, 25.7 percent to the Turkic Republics, and 20.3 percent to other European countries.

Konukoğlu stated that in January the number of workplaces filing declarations rose by 10.6 percent and the number of registered insured employees by 18.4 percent, adding: “For this we thank all our industrialist colleagues. These are the finest examples of the move to the registered economy.”
Bidding farewell to the Assembly members, Deputy Chairman Atilla Güner said: “Know the value of this Chamber. We came together with the chairman, and we leave together. I thank you all.”
Mehmet Haratoğlu, who served for three terms, also offered his thanks: “I thank Abdulkadir Konukoğlu very much; I was happy to serve alongside him. I also thank Adil Bey very much. This family is worthy of Antep, a family that values its people and its country. GSO is one of Turkey’s model chambers. We all regard one another with brotherly love and respect.”


OASIS Members to Advance MQTT Standard for M2M/IoT Reliable Messaging

Cisco, Eclipse Foundation, Eurotech, IBM, Kaazing, Machine-To-Machine Intelligence (M2Mi), Red Hat, Software AG, TIBCO, and Others Partner to Standardize MQTT Protocol 

Organizations from around the world are collaborating at the OASIS open standards consortium to advance a lightweight reliable messaging transport protocol for the Machine-to-Machine (M2M)/Internet of Things (IoT) marketplace. The new OASIS Message Queuing Telemetry Transport (MQTT) Technical Committee will develop a standardized version of the MQTT protocol.
MQTT is a publish/subscribe messaging transport protocol optimized to connect physical-world devices and events with enterprise servers and other consumers. It is designed to overcome the challenges of connecting the rapidly expanding physical world of sensors, actuators, phones, and tablets with established software processing technologies.
"MQTT is well suited to underpin the world of M2M/IoT and mobile applications where resources such as bandwidth and battery power are at a premium. It is an extremely lightweight, simple yet reliable protocol, designed for use where small code footprints are often advantageous," said Richard Coppen of IBM, who co-chairs the new OASIS MQTT Technical Committee, along with Raphael Cohn.
"At OASIS, we’ll be addressing IoT data flow challenges and other issues that have been identified by the community," added Cohn. "We'll broaden the range of enterprise solutions by enabling integration with business applications and expanding connectivity to different types of networks and remote devices."
MQTT is already widely implemented across a variety of industries and applications. For example, MQTT has been used in sensors communicating to a broker via satellite links, over occasional dial-up connections with healthcare providers (medical devices), and in a range of home automation and small device scenarios. MQTT is well suited for mobile applications because of its small size, minimized data packets, and efficient distribution of information to one or many receivers.
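MQTT's one-to-many distribution is organized around hierarchical topic strings, where a subscription filter can use `+` to match exactly one level and `#` to match all remaining levels. The sketch below is a toy, in-memory illustration of that topic-matching idea; it is not the MQTT wire protocol, and the `Broker` class and topic names are invented for the example.

```python
def topic_matches(topic_filter: str, topic: str) -> bool:
    """MQTT-style filter match: '+' matches one level, '#' matches the rest."""
    flevels = topic_filter.split("/")
    tlevels = topic.split("/")
    for i, level in enumerate(flevels):
        if level == "#":
            return True                      # multi-level wildcard: match everything below
        if i >= len(tlevels):
            return False                     # filter is deeper than the topic
        if level != "+" and level != tlevels[i]:
            return False                     # literal level mismatch
    return len(flevels) == len(tlevels)

class Broker:
    """A toy in-memory publish/subscribe broker (illustrative only)."""
    def __init__(self):
        self.subscriptions = []              # (filter, callback) pairs

    def subscribe(self, topic_filter, callback):
        self.subscriptions.append((topic_filter, callback))

    def publish(self, topic, payload):
        for topic_filter, callback in self.subscriptions:
            if topic_matches(topic_filter, topic):
                callback(topic, payload)

received = []
broker = Broker()
broker.subscribe("sensors/+/temperature", lambda t, p: received.append((t, p)))
broker.publish("sensors/boiler1/temperature", "72.5")   # matches the filter
broker.publish("sensors/boiler1/pressure", "1.2")       # does not match
print(received)  # [('sensors/boiler1/temperature', '72.5')]
```

A real deployment would of course use an MQTT broker and client library; the point here is only how a single topic filter fans one message out to interested receivers.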
The MQTT specification has been contributed to the OASIS MQTT Technical Committee by IBM and Eurotech, authors of the original version of the protocol.
"We applaud IBM and Eurotech for bringing MQTT into the OASIS open standards process. Companies feel more confident about implementing the protocol when they can actively participate in its future," noted Laurent Liscia, OASIS CEO. "We're very pleased that the MQTT Technical Committee will operate under the OASIS Non-Assertion Intellectual Property Rights mode, which will provide added reassurance for developers and promote widespread adoption."
New members are encouraged to join the OASIS MQTT Technical Committee at any time. Archives of the work are accessible to both members and non-members, and OASIS invites public review and comment on the work.



BlackJet offers guaranteed private jet seat availability and the most affordable private jet charter solution. BlackJet booking technology provides confirmed seats on private jets in 10 seconds. BlackJet private jet seat service is a perfect complement to fractional jet ownership, jet card programs and private jet charter. BlackJet is not an owner or operator of aircraft. Flights are provided by professional aircraft operators who are authorized by the FAA and DOT to provide on demand air charter service. BlackJet manages the booking and quality control of the guaranteed private jet seat service for our members. 


With CISPA dead, a look ahead to the Senate’s coming work on cybersecurity

We’re back to where we were, amazingly. In 2012, we watched CISPA pass in the House, insufficiently amended to protect the privacy of the average citizen. The President threatened to veto the bill. The Senate worked on its own law, all but ignoring what the House had produced.
And in 2013, the same tune is being hummed. CISPA has passed the House, again. The President threatened to veto the bill, again. And the Senate is, again, ignoring CISPA in favor of working on its own bill. Progress! I’m kidding. And frankly, if you were not impressed with the Senate’s efforts last year, I doubt that what we are going to see in 2013 will hearten you.
Let’s retread. Then-retiring Senator Lieberman called the issue key and threw his final days in office into working on it. Despite that, and massive pressure from both the media and the Executive branch, the Senate failed to pass a bill. To recap: the House produced a bill that could not pass, and the Senate didn’t even get that far.
The issue slid into the hands of the next Congress, our current set of elected officials. The amendment process failed to create a bill that garnered 60 votes. From right before it failed:
After pressure from Sen. McCain and others, the bill’s security mandates for critical infrastructure were stripped. Even with that, however, compromise fell short. Over 200 amendments were filed to change the bill, prompting Senator Lieberman to publicly castigate his peers for being off topic.
Its final failure:
[T]oday’s vote on the leading cybersecurity bill in the US Senate failed 52-46. While the measure did garner a simple majority, it did not hit the 60-vote threshold required to break a filibuster. Thus, given the current operational paradigm of the Senate, it failed.
What happened? A short look backwards will explain what we’re likely to see over the next few months.

Mandatory Standards

If you had to explain why the Senate failed to pass a bill, the short answer is this: mandatory security standards for critical infrastructure. The Senate majority wanted them, while the other party did not.
In short, to many, any new regulation on business is unacceptable. This led to horse-trading over incentives that would encourage critical infrastructure operators to meet cybersecurity standards, rather than out-and-out mandates. That wasn’t enough, and it appears unlikely that the climate in the Senate has changed.
Given that key philosophical difference, no clear path exists for cybersecurity legislation to advance in the Senate. Unless compromise can find legs that it lacked last year, progress will be a combination of halting and grinding, if we see any progress at all.
Unless mandatory standards are stripped from any Senate bill, it may stand no chance of passage. Still, the President favored the standards before, so he will likely press for them again. This puts Senate Democrats in a hard place: is it better to pass something, even if it is only a fraction of their vision?
CISPA is behind us. The President’s cybersecurity executive order is behind us. It’s time for the Senate to stand up and lead.

By Alex Wilhelm

Draft Special Publication 800-162, Guide to Attribute Based Access Control (ABAC) Definition and Considerations

NIST announces the public comment release of draft Special Publication (SP) 800-162, Guide to Attribute Based Access Control (ABAC) Definition and Considerations. ABAC is a logical access control methodology in which authorization to perform a set of operations is determined by evaluating attributes associated with the subject, object, requested operations and, in some cases, environment conditions against policy, rules, or relationships that describe the allowable operations for a given set of attributes. This document provides Federal agencies with a definition of ABAC and considerations for using ABAC to improve information sharing within and between organizations while maintaining control of that information.
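To make the definition concrete, here is a minimal, hypothetical sketch of ABAC-style evaluation in Python. The policy structure, attribute names, and healthcare scenario are invented for illustration; SP 800-162 defines the ABAC model itself, not any particular implementation.

```python
# A toy ABAC evaluator: a policy is a list of rules, and each rule
# constrains subject, object, and environment attributes plus the
# allowed operations. All names here are hypothetical examples.

def rule_matches(rule, request):
    """A rule matches when every attribute it constrains has the required value."""
    for category in ("subject", "object", "environment"):
        for attr, required in rule.get(category, {}).items():
            if request[category].get(attr) != required:
                return False
    return request["operation"] in rule["operations"]

def is_authorized(policy, request):
    """Grant access if any rule in the policy matches the request."""
    return any(rule_matches(rule, request) for rule in policy)

# Hypothetical policy: on-shift nurses may read medical records.
policy = [{
    "subject": {"role": "nurse", "on_shift": True},
    "object": {"type": "medical_record"},
    "operations": {"read"},
}]

request = {
    "subject": {"role": "nurse", "on_shift": True},
    "object": {"type": "medical_record"},
    "environment": {},
    "operation": "read",
}
print(is_authorized(policy, request))  # True
```

The key ABAC property the sketch shows is that the decision is computed from attributes at request time, rather than from a pre-assigned access control list.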

NIST requests comments on draft SP 800-162 by May 31, 2013. Please send comments to vincent.hu@nist.gov with the subject "Comments SP 800-162".


Second Cybersecurity Infrastructure Framework Workshop Gathers May 29-31, 2013

The National Institute of Standards and Technology (NIST) is holding the second of four planned workshops to develop a voluntary framework to reduce cybersecurity risks for critical infrastructure from May 29-31, 2013, at Carnegie Mellon University in Pittsburgh, Pa. The hands-on workshop is open to cybersecurity industry experts in all sectors—such as energy, finance, transportation and communications—as well as government and academic stakeholders.
In February 2013, the President issued the "Improving Critical Infrastructure Cybersecurity" Executive Order, which gave NIST the responsibility to develop the cybersecurity framework with industry input. NIST first issued a Request for Information (RFI) on this topic and received hundreds of comments in April 2013.
The first Cybersecurity Framework Workshop in early April brought together stakeholders to hear perspectives from industry leaders, reviews of the threat environment from Information Sharing and Analysis Centers in critical infrastructure fields, and to learn about government partnerships with the Department of Homeland Security and NIST. Streaming video of the workshop is available online.
During the spring and summer workshops, attendees will be asked to roll up their sleeves to identify, refine and guide the many interrelated considerations, challenges and efforts needed to build the framework. Participants in the working sessions will analyze and discuss the framework's initial inputs, including the RFI responses and preliminary analysis conducted by NIST.
Participants in the May 29-31 event are requested to review the RFI comments and NIST analysis, which will be available on the workshop event page, before attending the meeting. Plenary sessions will be webcast; participants should check the event Web page in late May for information. Reports on breakout sessions will be available on the event page after the workshop.
The workshop is limited to 500 attendees due to space constraints, so early registration is suggested at https://www-s.nist.gov/CRS/conf_disclosure.cfm?conf_id=6269.
More information about the cybersecurity critical infrastructure framework project, the Executive Order, the NIST RFI and comments is available at www.nist.gov/itl/cyberframework.cfm.


Registration Open for HTML5 Training Course; Early Bird Rate through 6 May

Registration is open for a new session of the HTML5 training course. Experienced trainer Michel Buffa will cover the techniques developers and designers need to create great Web pages and apps. Topics include video, animations, forms, and APIs to create location-based services, and offline applications. Training starts 3 June and lasts six weeks; students receive a certificate upon course completion. Register before May 6 to benefit from the early bird rate. Learn more about W3DevCampus, the W3C online training for Web developers.


ENISA info film - Everything is connected


Government Seeks to Fine Companies for Not Complying With Wiretap Orders

It isn’t often that communications companies push back against government requests to monitor customers and hand over information about them, but a government task force is seeking to make it even harder for companies to say no.

The task force is pushing for legislation that would penalize companies like Google, Facebook and Skype that fail to comply with court orders for wiretapping, according to the Washington Post. Non-compliance would carry an escalating series of fines, starting at tens of thousands of dollars; fines that remained unpaid after 90 days would double daily.

Unlike telecommunications companies that are required under the 1994 Communications Assistance for Law Enforcement Act (CALEA) to have systems that are wiretap-enabled, some internet communication methods — such as social networking sites and online gaming sites — aren’t easily wiretapped and are not required to enable the capability under CALEA. Companies that argue that they don’t have the means to enable wiretapping have avoided complying with court orders seeking real-time surveillance, the paper notes. The legislation is intended to force these companies into finding technology solutions that would enable real-time surveillance.

Microsoft reportedly applied for a patent in 2009 for a technology called Legal Intercept that would have the ability to secretly monitor, intercept and record Skype calls. Microsoft filed for the patent before it bought Skype in 2011.

The push for legislation to compel these companies to cooperate with wiretapping orders began in 2010 after Google initiated end-to-end encryption for Gmail and text messages, which made it more difficult for the FBI to intercept e-mail under a court order, the Post notes.

But critics like Matt Blaze, professor of computer science at the University of Pennsylvania, have argued that the intercept capabilities introduce vulnerabilities (.pdf) that make it possible for foreign intelligence agencies and others to hijack the surveillance systems on communication networks and do their own spying.

The move to wiretap the internet isn’t new. The New York Times reported in 2010 that federal officials were seeking new regulations to wiretap the internet.

The piece noted that officials wanted legislation that would require all communications providers — including encrypted e-mail providers like Google, social networking sites like Facebook, and messaging and voice services like Skype — to install the technical capability for wiretapping. Officials wanted these services to provide the ability not only to intercept and record communications but also to decrypt encrypted communications.

Officials argue that they’re not seeking new powers; they just want to extend the monitoring authority they currently have for telecommunications to other communication methods on the internet.
By Kim Zetter

KISSmetrics: Customer Web Analytics

Google Analytics tells you what happened, KISSmetrics tells you who did it.

KISSmetrics fills that gap by showing you every action each individual took. Finally, you'll know who your most valuable customers are and how to get more of them.


The growing importance of timing in data centers

Like a bad episode of Hoarders, people love to store all things digital, most of which will never be accessed again. And, like a bad episode of Storage Wars, our love of storing crap means we need more places to store it. Today’s content has outgrown even the hydroelectric-dam-powered mega data centers built just yesteryear. Increasingly, operators are turning to distributing their information across multiple geographically dispersed data centers. As the number, size, and distances between the data centers have steadily grown, timing distribution and accuracy have likewise grown in importance in keeping the data centers in sync.
In a previous article I discussed new standards being developed to increase the accuracy of timing for the internet and other IP-based networks. Current systems and protocols offer milliseconds of accuracy, but that just isn’t enough as we depend more on real-time information and as compute, storage, and communications networks become more distributed. While people often cite the importance of timing on mobile backhaul in next-generation LTE-Advanced networks, there has been less publicity around the need for these new timing technologies in the continued growth of data centers.

The rise of Hadoop in an age of digital garbage

Dinosaur image courtesy of Flickr user Denise Chen.

Massive storage of data appears to occur in periods, very analogous to dinosaur evolution. A database architecture rises to the forefront on the strength of its advantages, until it scales to the breaking point and is completely superseded by a new architecture. At first, databases were simply serial lists of values in row/column arrangements. Database technology leapt forward and became a self-sufficient business with the advent of relational databases. For a while it appeared relational databases would be the last word in information storage, but then came Web 2.0, social media, and the cloud. Enter Hadoop.
A centralized database works, as the name suggests, by having all the data located in a single indexed repository with massive computational power to run operations on it. But a centralized database cannot hope to scale to the size needed by today’s cloud apps. Even if it could, the time needed to perform a single lookup would be unbearable to an end user at a browser window.
Hadoop de-centralizes storage, lookup, and computational power. There is no index, per se. Content is distributed across a wide array of servers, each with its own storage and CPUs, and the location and relation of each piece of data is mapped. When a lookup occurs, the map is read, and all the pieces of information are fetched and pieced together again. The main benefit of Hadoop is scalability: to grow a database (and its computational power), you simply keep adding servers and growing your map.
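That map-then-gather flow can be sketched with a toy word count in plain Python. The "shards" standing in for servers are an assumption made for illustration; real Hadoop distributes both the data (in HDFS) and the computation across machines.

```python
# A toy sketch of the MapReduce pattern behind Hadoop: each mapper works
# on its own shard of the data, results are shuffled by key, and reducers
# aggregate. Purely illustrative, not the Hadoop API.
from collections import defaultdict

shards = [  # pretend each list lives on a different server
    ["the quick brown fox"],
    ["the lazy dog", "the end"],
]

def map_phase(shard):
    """Emit (word, 1) for every word in this shard's lines."""
    for line in shard:
        for word in line.split():
            yield (word, 1)

def shuffle(mapped):
    """Group emitted values by key, as the framework would between phases."""
    groups = defaultdict(list)
    for key, value in mapped:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Aggregate each key's values into a final count."""
    return {key: sum(values) for key, values in groups.items()}

mapped = [pair for shard in shards for pair in map_phase(shard)]
counts = reduce_phase(shuffle(mapped))
print(counts["the"])  # 3
```

The design choice the article describes is visible here: nothing is indexed up front, so scaling out just means adding shards and mappers, but every lookup has to touch the whole map.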

Even Hadoop is buried under mounds of digital debris

It looked like Hadoop would reign supreme for generations to come, with extensions continuously breathing new life into the platform. Yet after only a decade, Hadoop-based databases such as Facebook’s are at the breaking point. Global traffic growth is beyond exponential, and most of it is trash. Today’s databases look more like landfills than the great Jedi Archives. And recently hyped trends such as lifelogging suggest the problem will get much worse long before it gets better.
The main limitation of Hadoop is that it works great within the walls of a single massive data center, but is less than stellar once a database outgrows those walls and has to run across geographically separated data centers. It turns out the main strength of Hadoop is also its Achilles heel: with no index to search, every piece of data must be sorted through, a difficult proposition once databases stretch across the globe. A piece of retrieved data might be stale by the time it reaches the requester, or mirrored copies of data might conflict with one another.
Enter an idea to keep widely dispersed data centers in sync: Google True Time. To grossly oversimplify the concept, the True Time API adds time attributes to data being stored, not just for expiration dating, but also so that the content of all the geographically disparate data centers can be time-aligned. For database aficionados, this is sacrilegious, as all leading database protocols are specifically designed to ignore time to prevent conflicts and confusion. Google True Time completely turns the concept of data storage inside out.

Introducing Spanner

In True Time, knowing the accurate "age" of each piece of information — in other words, where it falls on the timeline of data — allows data centers that may be 100 ms apart to synchronize not just the values stored in memory locations, but the timeline of values in memory locations. In order for this to work, Google maintains an accurate "global wall-clock time" across its entire global Spanner network.
Transactions that write are timestamped and use strict two-phase locking (S2PL) to manage access. The commit order is always the timestamp order, and both commit and timestamp orders respect global wall-clock time. This simple set of rules maintains coordination between databases all over the world.
However, an element of uncertainty is introduced into each data field — the very reason time has been shunned in database protocols since the dawn of data itself.

Google calls this "network-induced uncertainty," denotes it with an epsilon, and actively monitors and tracks the metric. As of summer 2012, the value was running around 10 ms at 99.9 percent (three nines) certainty. Google’s long-term goal is to reduce it below 1 ms. Accomplishing this will require a state-of-the-art timing distribution network, leveraging the same technologies being developed and deployed for 4G LTE backhaul networks.
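The interplay between that uncertainty bound and the commit-order rule can be simulated in a few lines of Python. This is a sketch of the published "commit wait" idea under the assumption of an interval clock; it is not Google's implementation, and `EPSILON` simply reuses the 10 ms figure cited above.

```python
# Simulate a TrueTime-style interval clock: now() returns a range the
# true time is guaranteed to lie in, and a transaction's timestamp only
# becomes visible after it is certainly in the past ("commit wait").
import time

EPSILON = 0.010  # 10 ms of clock uncertainty, the figure cited for summer 2012

def tt_now():
    """Return an (earliest, latest) interval containing the true time."""
    t = time.monotonic()
    return (t - EPSILON, t + EPSILON)

def commit(write):
    """Pick a timestamp no clock has passed yet, then wait it out."""
    _, latest = tt_now()
    timestamp = latest
    while tt_now()[0] < timestamp:   # commit wait: block until definitely past
        time.sleep(EPSILON / 10)
    return timestamp, write

ts1, _ = commit("row=a")
ts2, _ = commit("row=b")
assert ts2 > ts1  # commit order and timestamp order agree with wall-clock order
```

The cost of the scheme is visible in the loop: every writing transaction stalls for roughly twice the uncertainty bound, which is why shrinking epsilon below 1 ms matters so much.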

A modest proposal

While True Time was most likely developed to improve geographic load balancing, now that accurate time stamping of data exists, the possibilities are profound. The problems associated with large databases go beyond simply managing the data; the growth rate itself is unsustainable. Data storage providers must do more than grow their storage — they must also improve efficiency and stem the tsunami of waste that is common in the age of relatively free storage.
It’s a dangerous notion, but one simply must challenge the basic tenet that all data is forever. Our minds don’t work that way; why should computers? We hold on only to key memories, and the further an event recedes in time, the fewer details we retain. Perhaps data storage could work similarly. Rather than deleting a picture that hasn’t been accessed in a while, a search could find similar photos and keep only one. And as time passes, rather than simple deletion, a photo could be continuously compressed, with less information kept, until the photo memory fades into oblivion — like that old Polaroid hung on the refrigerator door.

By Jim Theodoras

iTunes Store at 10: how Apple built a digital media juggernaut

After a decade of success, can Cupertino ride the next wave?

Ten years ago this month, a music sector ravaged by Napster and largely ignorant of digital distribution found a savior of sorts in what was then called the iTunes Music Store. With its 99-cent unbundled songs, the service quickly became the only significant source for acquiring music legally online.

With iTunes, Apple had drawn the blueprint for distributing music, movies, books, and apps over the web. By supplying and tying together a music player, online store, and song-management software, Apple drastically simplified the entire music experience, defying the odds to build a music-retailing dynasty even as file sharing skyrocketed. A decade ago, Apple started to answer what would become an all-important question: how do you get consumers to pay for content again?

"They invented the digital music business," said Michael Nash, the former digital chief at Warner Music Group. "Apple really created the convergence of music and technology and showed everyone what the connected economy around content looks like."

Now known simply as the iTunes Store, the music, movie, TV, book, and app marketplace celebrates its 10-year anniversary on April 28th. Few should be singing Happy Birthday with more zeal than those at the major entertainment companies. And now, as the iTunes Store enters its second decade, there’s a growing sentiment that iTunes has become bloated and stagnant, that Apple is resting on its laurels and failing to innovate while a new generation of music services begin to find an audience. Over the next ten years, will the company be able to evolve its longstanding business model and keep dominating in the face of upstart competitors?

A new model

In the early 2000s, web piracy began to mushroom and digital music services launched by the large record labels — MusicNet and Pressplay — were busts. There was a "scramble in the music industry to create a service to answer the marketplace," according to Paul Vidich, the former executive vice president of Warner Music Group and the first label exec to cut a licensing deal with iTunes.
In April of 2003, the iPod was already drawing intense consumer interest — Apple claimed the device was the number one MP3 player in the world with over 700,000 sold. (For context, Apple sold 5.6 million iPods in the last quarter, despite the continued downturn in the MP3 player market.) Getting content for that iPod, however, was a bit of a mess — it seemed nobody knew how to build an easy-to-use web music service. Napster may have shut down in late 2001, but P2P music sharing was already out of the bag — and the options for legally acquiring music online were poor.
However, Apple aimed to change all that with the iTunes Music Store, the first real a la carte download music service, built directly into the software. At launch, it was Mac-only and offered a relatively tiny catalog: 200,000 songs (it currently has 26 million). But it did have the support of the major record labels of the day: Universal, EMI, Warner, Sony, and BMG. The partnerships were key to helping Apple take control of music distribution — without the songs, the iPod was a nicely designed but empty box.
Steve Jobs' salesmanship helped close those deals. Vidich remembers the Apple CEO flying to New York to demonstrate iTunes for Warner execs sometime around September 2002. CEOs didn't often handle product demos but Jobs "was so exuberant about iTunes and its simplicity," Vidich said. "We were too. The other products out there just weren't simple to use."
Jobs certainly had his challenges. Vidich said he's the one who suggested that iTunes charge 99 cents per track and he remembers Jobs nearly hugged him. At the time, Sony Music execs wanted to charge more than $3 a track, according to Vidich. No doubt a $3 song price would have tied an anchor around iTunes' neck, stifling growth. 99 cents, on the other hand, was below the sub-$1 psychological barrier — and has continued to be an important price point for not only music but the wide swath of 99-cent iOS apps in the store.

Eventually all the major record companies signed one-year deals, and at the conclusion of the store’s first year, the labels found themselves captives. iTunes sales had grown so fast and the buzz was so electric that Jobs held all the leverage in subsequent negotiations. Apple sold one million songs in the first week and 10 million by September of 2003. In its first year, the company sold 50 million. "If you were in that space and you weren't supplying iTunes," Vidich said, "you weren't cool."
Apple’s most important move to make itself cool was to create one of the most memorable ad campaigns of the last decade: the famous dancing-silhouette ads.
In contrast to Apple’s typical spots focusing on simplicity and design elements, its iPod / iTunes ads featured loud music, quick movements — and no clear shot of the product itself. The ads sold you music and white earbuds, with the rest left to your imagination. When the iPod was first released, those white headphones were an exotic rarity that pointed to someone carrying a piece of bleeding edge technology — but after Apple’s silhouette commercials, they soon became a constant reminder of Apple’s dominance in the music world.
By Nathan Ingraham 

Fifty Years of Clearing the Skies

A Milestone in Environmental Science
Ringed by mountains and capped by a temperature inversion that traps bad air, Los Angeles has had bouts of smog since the turn of the 20th century. An outbreak in 1903 rendered the skies so dark that many people mistook it for a solar eclipse. Angelenos might now be living in a state of perpetual midnight—assuming we could live here at all—were it not for the work of Caltech Professor of Bio-organic Chemistry Arie Jan Haagen-Smit. How he did it is told here largely in his own words, excerpted from Caltech's Engineering & Science magazine between 1950 and 1962. (See "Related Links" for the original articles.)
Old timers, which in California means people who have lived here some 25 years, will remember the invigorating atmosphere of Los Angeles, the wonderful view of the mountains, and the towns surrounded by orange groves. Although there were some badly polluted industrial areas, it was possible to ignore them and live in more pleasant locations, especially the valleys . . . Just 20 years ago, the community was disagreeably surprised when the atmosphere was filled with a foreign substance that produced a strong irritation of the eyes. Fortunately, this was a passing interlude which ended with the closing up of a wartime synthetic rubber plant. (November 1962)
Alas, the "interlude" was an illusion. In the years following World War II, visibility often fell to a few blocks. The watery-eyed citizenry established the Los Angeles County Air Pollution Control District (LACAPCD) in 1947, the first such body in the nation. The obvious culprits—smoke-belching power plants, oil refineries, steel mills, and the like—were quickly regulated, yet the problem persisted. Worse, this smog was fundamentally different from air pollution elsewhere—the yellow, sulfur-dioxide-laced smog that killed 20 people in the Pennsylvania steel town of Donora in 1948, for example, or London's infamous pitch-black "pea-soupers," where the burning of low-grade, sulfur-rich coal added soot to the SO2. (The Great Smog of 1952 would carry off some 4,000 souls in four days.) By contrast, L.A.'s smog was brown and had an acrid odor all its own.
Haagen-Smit had honed his detective skills isolating and identifying the trace compounds responsible for the flavors of pineapples and fine wines, and in 1948 he began to turn his attention to smog.
Chemically, the most characteristic aspect of smog is its strong oxidizing action . . . The amount of oxidant can readily be determined through a quantitative measurement of iodine liberated from potassium iodide solution, or of the red color formed in the oxidation of phenolphthalin to the well-known acid-base indicator, phenolphthalein. To demonstrate these effects, it is only necessary to bubble a few liters of smog air through the colorless solutions. (December 1954)
His chief suspect was ozone, a highly reactive form of oxygen widely used as a bleach and a disinfectant. It's easy to make—a spark will suffice—and it's responsible for that crisp "blue" odor produced by an overloaded electric motor. But there was a problem:
During severe smog attacks, ozone concentrations of 0.5 ppm [parts per million], twenty times higher than in [clean] country air, have been measured. From such analyses the quantity of ozone present in the [Los Angeles] basin at that time is calculated to be about 500 tons.
Since ozone is subject to a continuous destruction in competition with its formation, we can estimate that several thousand tons of ozone are formed during a smog day. It is obvious that industrial sources or occasional electrical discharges do not release such tremendous quantities of ozone. (December 1954)
If ozone really was to blame, where was it coming from? An extraordinary challenge lay ahead:
The analysis of air contaminants has some special features, due to the minute amounts present in a large volume of air. The state in which these pollutants are present—as gases, liquids and solid particles of greatly different sizes—presents additional difficulties. The small particles of less than one micron diameter do not settle out, but are in a stable suspension and form so-called aerosols.
The analytical chemist has devoted a great deal of effort to devising methods for the collection of this heterogeneous material. Most of these methods are based on the principle that the particles are given enough speed to collide with each other or with collecting surfaces . . . A sample of Los Angeles' air shows numerous oily droplets of a size smaller than 0.5 micron, as well as crystalline deposits of metals and salts . . . When air is passed through a filter paper, the paper takes on a grey appearance, and extraction with organic solvents gives an oily material. (December 1950)
Haagen-Smit suspected that this oily material, a complex brew of organic acids and other partially oxidized hydrocarbons, was smog's secret ingredient. In 1950, he took a one-year leave of absence from Caltech to prove it, working full-time in a specially equipped lab set up for him by the LACAPCD. By the end of the year, he had done so.
Through investigations initiated at Caltech, we know that the main source of this smog is due to the release of two types of material. One is organic material—mostly hydrocarbons from gasoline—and the other is a mixture of oxides of nitrogen. Each one of these emissions by itself would be hardly noticed. However, in the presence of sunlight, a reaction occurs, resulting in products which give rise to the typical smog symptoms. The photochemical oxidation is initiated by the dissociation of NO2 into NO and atomic oxygen. This reactive oxygen attacks organic material, resulting in the formation of ozone and various oxidation products . . . The oxidation reactions are generally accompanied by haze or aerosol formation, and this combination aggravates the nuisance effects of the individual components of the smog complex. (November 1962)
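The mechanism Haagen-Smit describes — sunlight splitting NO2, and the freed atomic oxygen driving ozone and oxidation-product formation — can be summarized in three steps (a simplified sketch of the photochemical cycle, not his full reaction scheme):

```latex
\begin{align*}
\mathrm{NO_2} + h\nu &\rightarrow \mathrm{NO} + \mathrm{O} && \text{(sunlight dissociates nitrogen dioxide)}\\
\mathrm{O} + \mathrm{O_2} &\rightarrow \mathrm{O_3} && \text{(atomic oxygen forms ozone)}\\
\mathrm{O_3},\,\mathrm{O} + \text{hydrocarbons} &\rightarrow \text{oxidized organics, aerosols} && \text{(the irritating smog products)}
\end{align*}
```

Neither ingredient alone is enough; it is the sunlight-driven interplay of the two exhaust components that produces the haze.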
Professor of Plant Physiology Frits Went was also on the case. Went ran Caltech's Earhart Plant Research Laboratory, which he proudly called the "phytotron," by analogy to the various "trons" operated by particle physicists. (Phyton is the Greek word for plant.) "Caltech's plant physiologists happen to believe that the phytotron is as marvellously complicated as any of the highly-touted 'atom-smashing' machines," Went wrote in E&S in 1949. "[It] is the first laboratory in the world in which plants can be grown under every possible climatic condition. Light, temperature, humidity, gas content of the air, wind, rain, and fog—all these factors can be simultaneously and independently controlled. The laboratory can create Sacramento Valley climate in one room and New England climate in another." Most of Los Angeles was still orchards and fields instead of tract houses, and the smog was hurting the produce. Went, the LACAPCD, and the UC Riverside agricultural station tested five particularly sensitive crops in the phytotron, Haagen-Smit wrote.
The smog indicator plants include spinach, sugar beet, endive, alfalfa and oats. The symptoms on the first three species are mainly silvering or bronzing of the underside of the leaf, whereas alfalfa and oats show bleaching effects. Some fifty compounds possibly present in the air were tested on their ability to cause smog damage—without success. However, when the reaction products of ozone with unsaturated hydrocarbons were tried, typical smog damage resulted. (December 1950)
And yet a third set of experiments was under way. Rubber tires were rotting from the smog at an alarming rate, cracking as they flexed while rolling along the road. Charles E. Bradley, a research associate in biology, turned this distressing development into a cheap and effective analytical tool by cutting rubber bands by the boxful into short segments. The segments—folded double, secured with a twist of wire, and set outside—would start to fall apart almost before one could close the window. "During severe smog initial cracking appears in about four minutes, as compared to an hour or more required on smog-free days, or at night," Haagen-Smit wrote in the December 1954 E&S.
The conclusion that airborne gasoline and nitrogen oxides (another chief constituent of automobile exhaust) were to blame for smog was not well received by the oil refineries, which hired their own expert to prove Haagen-Smit wrong. Abe Zarem (MS '40, PhD '44), the manager and chairman of physics research for the Stanford Research Institute, opined that stratospheric ozone seeping down through the inversion layer was to blame. But seeing (or smelling) is believing, so Haagen-Smit fought back by giving public lectures in which he would whip up flasks full of artificial smog before the audience's eyes, which would soon be watering—especially if they were seated in the first few rows. By the end of his talk, the smog would fill the hall, and he became known throughout the Southland as Arie Haagen-Smog.
By 1954, he and Frits Went had carried the day.
[Plant] fumigations with the photochemical oxidation products of gasoline and nitrogen dioxide (NO2) was the basis of one of the most convincing arguments for the control of hydrocarbons by the oil industry. (December 1954)
It probably didn't hurt that an outbreak that October closed schools and shuttered factories for most of the month, and that angry voters were wearing gas masks to protest meetings. By then, there were some two million cars on the road in the metropolitan area, spewing a thousand tons of hydrocarbons daily.
Incomplete combustion of gasoline allows unburned and partially burned fuel to escape from the tailpipe. Seepage of gasoline, even in new cars, past piston rings into the crankcase, is responsible for 'blowby' or crankcase vent losses. Evaporation from carburetor and fuel tank are substantial contributions, especially on hot days. (November 1962)
Haagen-Smit was a founding member of California's Motor Vehicle Pollution Control Board, established in 1960. One of the board's first projects was testing positive crankcase ventilation (PCV) systems, which sucked the blown-by hydrocarbons out of the crankcase and recirculated them through the engine to be burned on the second pass. PCV systems were mandated on all new cars sold in California as of 1963. The blowby problem was thus easily solved—but, as Haagen-Smit noted in that same article, it was only the second-largest source, representing about 30 percent of the escaping hydrocarbons.
The preferred method of control of the tailpipe hydrocarbon emission is a better combustion in the engine itself. (The automobile industry has predicted the appearance of more efficiently burning engines in 1965. It is not known how efficient these will be, nor has it been revealed whether there will be an increase or decrease of oxides of nitrogen.) Other approaches to the control of the tailpipe gases involve completing the combustion in muffler-type afterburners. One type relies on the ignition of gases with a sparkplug or pilot-burner; the second type passes the gases through a catalyst bed which burns the gases at a lower temperature than is possible with the direct-flame burners. (November 1962)
Installing an afterburner in the muffler has some drawbacks, not the least of which is that the notion of tooling around town with an open flame under the floorboards might give some people the willies. Instead, catalytic converters became required equipment on California cars in 1975.
In 1968, the Motor Vehicle Pollution Control Board became the California Air Resources Board, with Haagen-Smit as its chair. He was a member of the 1969 President's Task Force on Air Pollution, and the standards he helped those two bodies develop would eventually be adopted by the Environmental Protection Agency, established in 1970—the year that also saw the first celebration of Earth Day. It was also the year when ozone levels in the Los Angeles basin peaked at 0.58 parts per million, nearly five times the 0.12 parts per million that the EPA would declare to be safe for human health. This reading even exceeded the 0.5 ppm that Haagen-Smit had measured back in 1954, but it was a triumph nonetheless—the number of cars in L.A. had doubled, yet the smog was little worse than it had always been. That was the year we turned the corner, in fact, and our ozone levels have been dropping ever since—despite the continued influx of cars and people to the region.
Haagen-Smit retired from Caltech in 1971 as the skies began to clear, but continued to lead the fight for clean air until his death in 1977—of lung cancer, ironically, after a lifetime of cigarettes. Today, his intellectual heirs, including professors Richard Flagan, Mitchio Okumura, John Seinfeld, and Paul Wennberg, use analytical instruments descended from ones Haagen-Smit would have recognized and computer models sophisticated beyond his wildest dreams to carry the torch—a clean-burning one, of course—forward.

By Douglas Smith

NeuroSky: Brainwave Sensors For Everybody

It is NeuroSky’s duty to bridge the gap between technology and the human body. Our research-grade instruments measure brainwave EEG and heartbeat ECG signals to entertain, relax, detect, heal, and exercise the most vital of our organs. With our partners, we make the intangible, tangible. That’s why, at NeuroSky, we make
Bio-Sensors for Every Body.



Samsung Demos a Tablet Controlled by Your Brain

An easy-to-use EEG cap could expand the number of ways to interact with your mobile devices.
Samsung mind control device
One day, we may be able to check e-mail or call a friend without ever touching a screen or even speaking to a disembodied helper. Samsung is researching how to bring mind control to its mobile devices with the hope of developing ways for people with mobility impairments to connect to the world. The ultimate goal of the project, say researchers in the company’s Emerging Technology Lab, is to broaden the ways in which all people can interact with devices.
In collaboration with Roozbeh Jafari, an assistant professor of electrical engineering at the University of Texas at Dallas, Samsung researchers are testing how people can use their thoughts to launch an application, select a contact, select a song from a playlist, or power a Samsung Galaxy Note 10.1 up or down. While Samsung has no immediate plans to offer a brain-controlled phone, the early-stage research, which involves a cap studded with EEG-monitoring electrodes, shows how a brain-computer interface could help people with mobility issues complete tasks that would otherwise be impossible.
Brain-computer interfaces that monitor brainwaves through EEG have already made their way to the market. NeuroSky’s headset uses EEG readings as well as electromyography to pick up signals about a person’s level of concentration to control toys and games (see “Next-Generation Toys Read Brain Waves, May Help Kids Focus”). Emotiv Systems sells a headset that reads EEG and facial expression to enhance the experience of gaming (see “Mind-Reading Game Controller”).
To use EEG-detected brain signals to control a smartphone, the Samsung and UT Dallas researchers monitored well-known brain activity patterns that occur when people are shown repetitive visual patterns. In their demonstration, the researchers found that people could launch an application and make selections within it by concentrating on an icon that was blinking at a distinctive frequency.
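The approach described — each icon blinking at its own frequency, with the user's attention boosting that frequency in the EEG — is commonly detected by comparing spectral power at the candidate blink rates. The sketch below illustrates the idea on simulated data; it is a minimal illustration of this class of classifier, not Samsung's or Jafari's actual method, and the sampling rate and frequencies are assumptions.

```python
import numpy as np

def detect_attended_icon(eeg, fs, blink_freqs):
    """Return the blink frequency with the strongest spectral response.

    A minimal frequency-tagging classifier: take the FFT of one EEG
    channel and pick whichever candidate icon frequency shows the most
    power, i.e. the icon the user is presumed to be concentrating on.
    """
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Power at the FFT bin nearest each icon's blink rate.
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in blink_freqs]
    return blink_freqs[int(np.argmax(powers))]

# Simulated two-second trace: the user attends to the icon blinking at 12 Hz,
# so the signal contains a 12 Hz component buried in noise.
fs = 256
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)

print(detect_attended_icon(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # → 12.0
```

Real systems must also contend with artifacts (eye blinks, muscle noise) and per-user calibration, which is part of why the reported selection rate is on the order of one choice every few seconds.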
Robert Jacob, a human-computer interaction researcher at Tufts University, says the project fits into a broader effort by researchers to find more ways for communicating with small devices like smartphones. “This is one of the ways to expand the type of input you can have and still stick the phone in the pocket,” he says.
Finding new ways to interact with mobile devices has driven the project, says Insoo Kim, Samsung’s lead researcher. “Several years ago, a small keypad was the only input modality to control the phone, but nowadays the user can use voice, touch, gesture, and eye movement to control and interact with mobile devices,” says Kim. “Adding more input modalities will provide us with more convenient and richer ways of interacting with mobile devices.”
Jafari’s research is addressing another challenge—developing more convenient EEG sensors. Classic EEG systems have gel or wet contact electrodes, which means a bit of liquid material has to come between a person’s scalp and the sensor. “Depending on how many electrodes you have, this can take up to 45 minutes to set up, and the system is uncomfortable,” says Jafari. His sensors, however, do not require a liquid bridge and take about 10 seconds to set up, he says. But they still require the user to wear a cap covered with wires.
The concept of a dry EEG is not new, and it can carry the drawback of lower signal quality, but Jafari says his group is improving the system’s processing of brain signals. Ultimately, if reliable EEG contacts were convenient to use and slimmed down, a brain-controlled device could look like “a cap that people wear all day long,” says Jafari.
Kim says the speed with which a user of the EEG-control system can control the tablet depends on the user. In the team’s limited experiments, users could, on average, make a selection once every five seconds with an accuracy ranging from 80 to 95 percent.
“It is nearly impossible to accurately predict what the future might bring,” says Kim, “but given the broad support for initiatives such as the U.S. BRAIN initiative, improvements in man-machine interfaces seem inevitable” (see “Interview with BRAIN Project Pioneer: Miyoung Chun”).

By Susan Young