From Facebook: Proposed Updates to our Governing Documents

We are proposing updates to two important legal documents – our Data Use Policy and our Statement of Rights and Responsibilities. These two documents tell you about how we collect and use data, and the rules that apply when you choose to use Facebook. From time to time we update these documents to make sure we keep you posted about the latest things you can do with Facebook.


Statement of Rights and Responsibilities


As part of this proposed update, we revised our explanation of how things like your name, profile picture and content may be used in connection with ads or commercial content, to make it clear that you grant Facebook permission for this use when you use our services. We are proposing this update as part of a settlement in a court case relating to advertising, and we have included an example of how these ads work to help you understand how we use your information in this way.


We also made a few other updates to make sure you understand that Facebook apps may be updated from time to time, and that although Facebook offers its services for free, you are responsible for any access fees, like data charges and text messages.


Data Use Policy


We also are proposing some updates to our Data Use Policy. Some of the key updates include:


  • Your information. We clarified that you share information with Facebook when you communicate with us, like when you send us an email.
  • Other information we receive about you. We simplified the explanation for how we receive information and clarified the types of information we receive when you use or run Facebook, including from your devices, such as your IP address or mobile phone number.
  • Personalized ads. We rewrote the entire advertising section to better explain what we thought was important for people to know about how we use the information we receive to provide relevant ads to people on and off Facebook.


In addition to the information we provide here, you may also review a section-by-section summary of updates for both of these documents, which provides more detail about the proposed changes. And, to see exact edits – whether substantial or just grammatical corrections – view the “tracked changes” English version.


Please read through these materials and provide feedback within the next seven days, by leaving comments below. As always, we will carefully consider your feedback before adopting any changes and we will post updates on the Site Governance page throughout the process.


For regular updates on our products and policies, please visit and like our Privacy and Site Governance pages.

By Erin Egan
Source and more details:
https://www.facebook.com/notes/facebook-site-governance/proposed-updates-to-our-governing-documents/1015316739594530

NIST Ytterbium Atomic Clocks Set Record for Stability

A pair of experimental atomic clocks based on ytterbium atoms at the National Institute of Standards and Technology (NIST) has set a new record for stability. The clocks act like 21st-century pendulums or metronomes that could swing back and forth with perfect timing for a period comparable to the age of the universe.
NIST physicists report in the Aug. 22 issue of Science Express that the ytterbium clocks' tick is more stable than any other atomic clock.* Stability can be thought of as how precisely the duration of each tick matches every other tick. The ytterbium clock ticks are stable to within less than two parts in 1 quintillion (1 followed by 18 zeros), roughly 10 times better than the previous best published results for other atomic clocks.
This dramatic breakthrough has the potential for significant impacts not only on timekeeping, but also on a broad range of sensors measuring quantities that have tiny effects on the ticking rate of atomic clocks, including gravity, magnetic fields, and temperature. And it is a major step in the evolution of next-generation atomic clocks under development worldwide, including at NIST and at JILA, the joint research institute operated by NIST and the University of Colorado Boulder.
"The stability of the ytterbium lattice clocks opens the door to a number of exciting practical applications of high-performance timekeeping," NIST physicist and co-author Andrew Ludlow says.
Each of NIST's ytterbium clocks relies on about 10,000 rare-earth atoms cooled to 10 microkelvin (10 millionths of a degree above absolute zero) and trapped in an optical lattice—a series of pancake-shaped wells made of laser light. Another laser that "ticks" 518 trillion times per second provokes a transition between two energy levels in the atoms. The large number of atoms is key to the clocks' high stability.
The ticks of any atomic clock must be averaged for some period to provide the best results. One key benefit of the very high stability of the ytterbium clocks is that precise results can be achieved very quickly. For example, the current U.S. civilian time standard, the NIST-F1 cesium fountain clock, must be averaged for about 400,000 seconds (about five days) to achieve its best performance. The new ytterbium clocks achieve that same result in about one second of averaging time.
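For white frequency noise, a clock's fractional instability averages down roughly as one over the square root of the averaging time, which is enough to reproduce the comparison above. Here is a minimal Python sketch; the 1-second instabilities are assumed, order-of-magnitude values for illustration, not NIST's published figures:

    import math

    def instability(sigma_1s, tau):
        """White-frequency-noise model: sigma(tau) = sigma(1 s) / sqrt(tau)."""
        return sigma_1s / math.sqrt(tau)

    def time_to_reach(sigma_1s, target):
        """Averaging time (s) needed to reach a target fractional instability."""
        return (sigma_1s / target) ** 2

    # Assumed 1-second instabilities (illustrative only):
    SIGMA_CESIUM_1S = 2e-13  # typical order for a cesium fountain
    SIGMA_YB_1S = 3e-16      # typical order for an optical lattice clock

    target = instability(SIGMA_CESIUM_1S, 400_000)  # NIST-F1's best, ~5 days
    print(f"target instability: {target:.1e}")                         # ~3e-16
    print(f"cesium needs {time_to_reach(SIGMA_CESIUM_1S, target):,.0f} s")  # 400,000 s
    print(f"ytterbium needs {time_to_reach(SIGMA_YB_1S, target):.1f} s")    # ~1 s

With these assumed numbers, the ytterbium clock reaches in about a second the instability the fountain clock needs roughly five days of averaging to achieve, matching the comparison in the text.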
Given this high level of stability, the ytterbium clocks can make measurements extremely rapidly—in real time in many cases—which could be important in rapidly changing application settings, such as the factory floor or the natural environment.
A key advance enabling the milestone performance of the ytterbium clocks was the recent construction of a second version of the clock to measure and improve the performance of the original, developed since 2003. Along the way, NIST scientists have made several improvements to both clocks, including the development of an ultra-low-noise laser used to excite the atoms, and the discovery of a method to cancel disruptive effects caused by collisions between atoms.
The ytterbium clocks' stability record is different from the performance levels previously publicized for NIST-F1, which is traceable to the international system of units, and NIST experimental optical clocks based on single ions, such as the aluminum quantum logic clock or the mercury ion clock.** NIST-F1 and the ion clocks were evaluated based on systematic uncertainty, another important metric for standard atomic clocks. NIST-F1's performance is described in terms of accuracy, which refers to how closely the clock realizes the cesium atom's known frequency, or natural vibration rate. Accuracy is crucial for time measurements that must be traced to a primary standard.
NIST scientists plan to measure the accuracy of the ytterbium clocks in the near future, and the accuracy of other high performance optical atomic clocks is under study at NIST and JILA. The research is funded in part by the Defense Advanced Research Projects Agency and the National Aeronautics and Space Administration (NASA).
*N. Hinkley, J.A. Sherman, N.B. Phillips, M. Schioppo, N.D. Lemke, K. Beloy, M. Pizzocaro, C.W. Oates, A.D. Ludlow. An atomic clock with 10^-18 instability. Science Express, Aug. 22, 2013.
**See 2010 NIST press release, "NIST's Second 'Quantum Logic Clock' Based on Aluminum Ion is Now World's Most Precise Clock," at www.nist.gov/pml/div688/logicclock_020410.cfm.
Source:
http://www.nist.gov/pml/div688/clock-082213.cfm

The FTC and the New Common Law of Privacy

Abstract:
One of the great ironies about information privacy law is that the primary regulation of privacy in the United States has barely been studied in a scholarly way. Since the late 1990s, the Federal Trade Commission (FTC) has been enforcing companies’ privacy policies through its authority to police unfair and deceptive trade practices. Despite more than fifteen years of FTC enforcement, there is no meaningful body of judicial decisions to show for it. The cases have nearly all resulted in settlement agreements. Nevertheless, companies look to these agreements to guide their privacy practices. Thus, in practice, FTC privacy jurisprudence has become the broadest and most influential regulating force on information privacy in the United States – more so than nearly any privacy statute and any common law tort.

In this article, we contend that the FTC’s privacy jurisprudence is the functional equivalent to a body of common law, and we examine it as such. We explore how and why the FTC, and not contract law, came to dominate the enforcement of privacy policies. A common view of the FTC’s privacy jurisprudence is that it is thin, merely focusing on enforcing privacy promises. In contrast, a deeper look at the principles that emerge from FTC privacy “common law” demonstrates that the FTC’s privacy jurisprudence is quite thick. The FTC has codified certain norms and best practices and has developed some baseline privacy protections. Standards have become so specific they resemble rules. We contend that the foundations exist to develop this “common law” into a robust privacy regulatory regime, one that focuses on consumer expectations of privacy, that extends far beyond privacy policies, and that involves a full suite of substantive rules that exist independently from a company’s privacy representations.


Authors: Daniel J. Solove, Woodrow Hartzog
Source and read the full article:
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2312913

The Innovation of Loneliness

One of the really interesting things in the world of social networking is how every now and then you bump into a specific resource, whether an online video clip, a blog post, a mainstream news article, a dissertation or whatever else, that is so thought-provoking, and so mind-boggling in how it challenges your own notions, experiences, know-how, skills and expertise on the topic of social networks, that it makes you think about your own experiences twice. It makes you pause and wonder, really, whether you are doing it right or not (and whatever that means with “doing it right“). Gary Hamel calls it “changing the way we change“. I call it growing up, although my notion of growing up is a completely different kind of growing up than the one Euan Semple blogged about just recently as well. Indeed, welcome to the disturbing world of Loneliness!
Euan, in a rather inspirational short blog post, puts all of his hopes for the Social Web (and our societies, for that matter) on those people who “are brave enough to be naive, foolish, enthusiastic, and open – because maybe that is how real grown ups should be?“. I would put my hopes on those, too, although I would also include those who are always open to exercising plenty of critical thinking, in a constructive manner, of course, and always willing to challenge the status quo to help us move forward, progress further, grow, and keep learning; to understand how we should strive to avoid becoming a commodity and instead thrive in the new Creativity Economy (yes, another economy to add into the mix) as empowered knowledge (Web) workers.
And then you bump into absolutely stunning video clips like the one put together by Shimi Cohen on the Innovation of Loneliness (Vimeo link, if interested), which starts off with a rather evocative and refreshing question that I doubt most of us out there on social networks have ever even dared to ask ourselves: “What is the connection between Social Networks and Being Lonely?“
I am not going to spoil the contents of the video for you. I am not even going to give you a teaser or two, like I typically do to entice you all to go and watch it. This time is different. This time I am too shocked to even muse about the key messages that came through after I watched it. It’s a little bit over 4 minutes long, and it’s one of those wonderfully troubling videos that will not leave you indifferent. On the contrary.
It will help you question, and big time!, not only your own notions of what social networks are all about, but also the role we all play in them. I can tell you that after I watched it I just couldn’t utter a single train of thought meaningful enough to share, other than “WOW!! Is this really where we are heading with our very own social networking experiences out there on the Social Web? Please tell me it isn’t. Please tell me we are aiming higher, bigger, better, because otherwise I know we are going to be in trouble, in deep trouble altogether“. Judge for yourselves what I mean and watch the video clip below. Let me know what you think in the comments. Yes, I know, I, too, am still coming to terms with how brilliantly poignant it is:

Anatomy of a hack: How the SEA took down the NYT and Twitter

The New York Times and Twitter’s UK site went offline for some users on Tuesday as part of an attack that the Syrian Electronic Army took credit for. The SEA, a hacktivist group supporting Syrian President Bashar al-Assad, is taking a less common route to taking down web sites: it’s attacking the domain name system.
Most public attacks against web sites have been denial-of-service attacks, in which the attackers gather a massive array of computers to flood the servers of their target, overwhelming them. But recently, attackers — including those launching denial-of-service attacks — have been hitting the Domain Name System, sensing a weak spot.
To get a sense of what is happening in a typical DNS attack, I emailed Cory von Wallenstein — he’s the CTO of Dyn, a company that provides cloud-based DNS services. Companies use Dyn to bypass the general DNS servers run by their own ISPs, with the idea that using such a service makes their web traffic (both inbound and outbound) faster and more secure.
Von Wallenstein explained that there are three types of attacks that escalate in complexity. The first is called a cache poisoning attack. In an email, von Wallenstein described it like this:
In that attack, hackers attempt to inject malicious DNS data into the recursive DNS servers that are operated by many ISPs. These DNS servers are typically the “closest” to users from a network topology perspective, so the damage is localized to specific users connecting to those servers.
Standards like DNSSEC can help protect against these types of attacks, and this wasn’t the type of attack used Tuesday afternoon. The second type is to take over one or more authoritative DNS servers for a domain and change the DNS data. Authoritative DNS servers are those that keep a list of addresses configured by an original source or an administrator on their behalf. Dyn does this for Twitter, for example.
Von Wallenstein said that if an attacker were to compromise authoritative DNS, the effect would be global. However, to do this, one would have to get past a company like Dyn or OpenDNS, which have built good security practices, including good social engineering training. This also wasn’t the type of attack used by the SEA against Twitter and the NYT.
According to von Wallenstein, the third form of attack — and the one used by the SEA on Tuesday — is to take over the registration of a domain and change the authoritative DNS servers. The attack isn’t on the domain name system itself, but on the registrars, in this case MelbourneIT. It’s the most time-consuming attack to undo, because while you can change the authoritative DNS servers back pretty quickly, the recursive DNS servers can cache information for a full day unless their operators perform a manual purge.
For huge sites like Twitter, the New York Times and The Huffington Post, ISPs are likely to notice the attack and make the effort to clear their DNS servers’ caches, but if an attack of this nature takes out a smaller site, it could be left down for a day or even longer. And if the SEA’s recent activity is any guide, we could see a lot more of these types of attacks.
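Because the registrar-level attack works by swapping a domain’s authoritative name servers, one simple detection is to poll the live NS delegation and compare it against a known-good list. A minimal sketch using the dnspython library; the domain and the expected server names below are hypothetical placeholders:

    import dns.resolver  # pip install dnspython

    # Hypothetical known-good delegation for a domain you operate.
    EXPECTED_NS = {"ns1.p26.dynect.net.", "ns2.p26.dynect.net."}

    def delegation_intact(domain: str) -> bool:
        """Compare the domain's live NS records against the expected set."""
        answer = dns.resolver.resolve(domain, "NS")
        live = {rdata.target.to_text() for rdata in answer}
        if live != EXPECTED_NS:
            print(f"ALERT: {domain} now delegates to {sorted(live)}")
            return False
        return True

    delegation_intact("example.com")

Note that even after a hijack is reverted, recursive resolvers may keep serving the attacker’s records until the cached entries expire, which is why this is the most time-consuming attack to undo.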

By Stacey Higginbotham
Source:
http://gigaom.com/2013/08/27/anatomy-of-a-hack-how-the-sea-took-down-the-nyt-and-twitter/

Intel Pocket Avatars Beta Test

[Image: Pocket Avatars. Image credit: Intel]

Welcome to the Experience Trial for Pocket Avatars, a new personal messaging app from Intel for expressing yourself to your friends and family! Pocket Avatars is a unique messaging application that creates a speaking avatar from your facial expressions for sending messages to your friends and family.
  • Pocket Avatars provides you with several different avatar models (each with different features and personalities), and each one will interact in a different and fun way depending on who you are.
  • No two messages can ever be the same with Pocket Avatars, in contrast to today's text- and emoticon-based messages.
  • You can play with the various avatar models, controlling and animating them just by using your own facial gestures (i.e., we have essentially replaced hand-held game controller buttons and joysticks with your own real-time facial movements).

If you are selected to participate in the beta, we will contact you at the email address you provide with an invite to TestFlight (http://testflightapp.com), where you will be able to register your device, allowing us to get you the app.

Source and more:
https://docs.google.com/forms/d/1q8Yk09EFZQ09q--xdnyJ2kjPybVwW5Wfz_LJuVkcuM8/viewform

Dust: A Blocking-Resistant Internet Transport Protocol


Abstract

Censorship of information on the Internet has been an increasing problem as the methods have become more sophisticated and increasing resources have been allocated to censor more content. A number of approaches to counteract Internet censorship have been implemented, from censorship resistant publishing systems to anonymizing proxies. A prerequisite for these systems to function against real attackers is that they also offer blocking resistance. Dust is proposed as a blocking-resistant Internet protocol designed to be used alone or in conjunction with existing systems to resist a number of attacks currently in active use to censor Internet communication. Unlike previous work in censorship resistance, it does not seek to provide anonymity in terms of unlinkability of sender and receiver. Instead it provides blocking resistance against the most common packet filtering techniques currently in use to impose Internet censorship.
By Brandon Wiley
Source and read the full text:
http://blanu.net/Dust.pdf

Sight over sound in the judgment of music performance

Abstract

Social judgments are made on the basis of both visual and auditory information, with consequential implications for our decisions. To examine the impact of visual information on expert judgment and its predictive validity for performance outcomes, this set of seven experiments in the domain of music offers a conservative test of the relative influence of vision versus audition. People consistently report that sound is the most important source of information in evaluating performance in music. However, the findings demonstrate that people actually depend primarily on visual information when making judgments about music performance. People reliably select the actual winners of live music competitions based on silent video recordings, but neither musical novices nor professional musicians were able to identify the winners based on sound recordings or recordings with both video and sound. The results highlight our natural, automatic, and nonconscious dependence on visual cues. The dominance of visual information emerges to the degree that it is overweighted relative to auditory information, even when sound is consciously valued as the core domain content.

By Chia-Jung Tsay
Source and read the full text:
http://www.pnas.org/content/early/2013/08/16/1221454110.abstract

Can Humans Breathe Liquid?

Deep water and the unprotected human body don't play well together—like, at all. But what if there were a way to get around the body's chemical limitations, a means of deep diving without the bends or lengthy decompression? Actually, there is. And we've almost figured out how to do it without killing ourselves in the process.

The Dangers of Deep

The recommended absolute limit for recreational SCUBA divers is just 130 feet, and technical dives using Trimix bottom out at 330 feet. Even then, you've got less than five minutes at depth before requiring monitored decompression to avoid getting the bends (the not-scary word for what happens when nitrogen dissolves into your tissues under the massive pressure of the water column, gets ejected into the bloodstream during ascent, and kills you with a brain embolism). Interestingly though, once your body hits its nitrogen saturation limit, it doesn't matter if you stay down for an hour or a month; your decompression time effectively maxes out.
This technique, known as saturation diving, is how recovery divers working on the K-141 Kursk were able to spend hours 300 feet below sea level (amidst 10 atmospheres of pressure) and how the crew in The Abyss were able to do their jobs.
This stuff also made an appearance in Dan Brown's novel The Lost Symbol.

Liquid Air

Perhaps the best-remembered scene from the 1989 Sci-Fi classic The Abyss is when Ed Harris' character has to don a liquid-filled diving suit in order to descend into the Mariana Trench. He then attempts to breathe what appears to be hot ham water in order to prevent the surrounding pressures from popping his lungs like bloody balloons. Turns out, this scene is closer to science fact than science fiction.
The substance is a perfluorocarbon (PFC), a synthetic fluorinated hydrocarbon liquid—clear, odorless, chemically and biologically inert, with a low surface tension and a high O2/CO2 carrying capacity. PFCs can hold as much as three times the oxygen and four times the carbon dioxide that human blood can. They also act as very efficient heat exchangers. This makes PFCs ideal for use as a liquid ventilation (LV) medium for medical applications.
Research into liquid ventilation (breathing an oxygen-rich liquid instead of air) and PFCs began in earnest immediately following the end of the First World War, when doctors studying treatment of poison gas inhalation began applying saline solutions to the lungs of test subjects (in this case, dogs). PFCs themselves were developed in the early 1940s as part of the Manhattan Project, where they were dubbed "Joe’s stuff."
However, it wasn't until the 1960s that the field really took off. It was the height of the Cold War, and the US military needed a way to increase the escape depth from the numerous submarines it had parked around the globe in the event of a catastrophic systems failure. In 1962, Dr. Johannes A. Kylstra and his team from Duke University showed that mice could be conditioned to breathe an oxygenated saline solution pressurized to 160 atmospheres (equivalent to about 1 mile below sea level), although the mice died just a few minutes later from respiratory acidosis (carbon dioxide poisoning). The system was far from perfect, but it illustrated that such a technique was indeed possible, albeit not yet practical.
Subsequent experiments performed by Leland C. Clark, Jr. and Frank Gollan showed that mice could breathe PFCs under normal atmospheric conditions, rats could remain submerged for up to 20 hours, and cats could last weeks. Their study also employed silicone oils as an alternative to PFCs but, as it turns out, silicone oil is really toxic to mammals (but only after returning to breathing normal air). PFCs are currently the only acceptable liquid ventilation medium we know of.
In 1989, human trials began in Philadelphia. Several near-death infants suffering from severe respiratory distress were administered total liquid ventilation—completely filling the lungs with PFC fluid, as opposed to filling them only to their functional residual capacity—and showed some remarkable physiological improvements, including in lung compliance and gas exchange. And that might just be the trick.
During normal development, a fetus's lungs are filled with amniotic fluid and, once a baby is born, a chemical known as surfactant helps prevent its lungs from collapsing. Premature babies, however, have not yet developed enough surfactant to prevent their lungs from folding in on themselves, so when they're suddenly exposed to a gas atmosphere they struggle just to breathe.
The Philadelphia trials aimed to see if liquid ventilation could accurately recreate conditions within the womb, act as an artificial surfactant, and reduce the neonates' stress. While the efforts weren't enough to save the infants' lives, the lung-performance improvements remained even after the ventilator was removed, and they proved that liquid ventilation is a potent therapy for premature babies.

The Last Hurdle

Despite its relative success during the Philadelphia trials, total liquid ventilation (TLV) remains very much an experimental procedure. In order to accurately and safely control the volumes of PFC flowing in and out of a patient's lungs, TLV systems require a membrane oxygenator, a heater, and an array of pumps to deliver the PFC—essentially, a dedicated liquid ventilator. Unfortunately, such a device has yet to make it past the prototype stage.
Partial Liquid Ventilation (PLV), on the other hand, only fills up about 40 percent of the patient's lungs with PFC, with the remaining capacity filled by air from a conventional gas ventilator. This means that PLV can be used with existing FDA-approved equipment and can be used to treat acute lung injuries as well as preemies. The PFC helps dislodge debris from alveoli (say, from smoke inhalation), open clogged pathways, and transport oxygen deeper into the lungs while protecting them from collapse and minimizing secondary damage.
But we still haven't overcome the issues that killed Kylstra's mice. The high viscosity of PFC prevents it from cycling through the lungs fast enough to expel CO2 and prevent respiratory acidosis. You'd have to cycle the fluid at a rate of 5 liters per minute to match a standard resting metabolism, and 10 liters a minute for any sort of activity, and human lungs simply aren't strong enough for such a task.
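The arithmetic behind those flow rates is just the body's CO2 output divided by how much CO2 each liter of fluid can carry away. A back-of-the-envelope Python sketch; the per-liter removal figure is an assumed value chosen for illustration, not a measured property of any PFC:

    # Rough ventilation-rate estimate for total liquid ventilation.
    CO2_OUTPUT_REST = 200.0       # mL CO2/min, typical resting adult
    CO2_REMOVED_PER_LITER = 40.0  # mL CO2 carried off per liter of PFC (assumed)

    flow_rest = CO2_OUTPUT_REST / CO2_REMOVED_PER_LITER  # ~5 L/min
    flow_active = 2 * flow_rest                          # ~10 L/min if output doubles
    print(f"at rest: {flow_rest:.0f} L/min; active: {flow_active:.0f} L/min")

Whatever the exact constants, the required flow scales linearly with metabolic rate, and PFC's viscosity keeps the lungs from moving fluid anywhere near that fast.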
In other words, The Abyss would have been a bit more accurate if Ed Harris had been carrying a ventilator with him. But even then, he probably wouldn't have made it very long. [Wikipedia - Science Daily - How Stuff Works - National Institute of Health - Kansas University Medical Center - British Journal of Anesthesiologists]
 
Source:

Facebook Releases Report on Government Requests

Facebook on Tuesday for the first time released a report showing the number of requests about its users that it has received from government agencies around the globe.
The report covers the first six months of 2013, ending on June 30, and notes that government groups in 74 countries demanded information about more than 37,954 accounts on Facebook.
Almost half of all the requests came from the United States, the report said.
Facebook did not honor all of the requests from countries around the world. The company said it complied to some extent with 79 percent of the 11,000-12,000 requests it received from American agencies.
Facebook noted that a vast majority of the requests from government agencies relate to criminal cases, including robberies and kidnappings, but that in some instances the requests pertain to national security issues. The report does not break down the numbers of requests on particular subjects.
The requests about Facebook users often seek basic information, the company said, including a person’s name and when that person joined Facebook. But in some cases governments try to find a person’s IP address — which is related to location — or to seek actual content posted on Facebook.

By Nick Bilton
Source and read more:
http://bits.blogs.nytimes.com/2013/08/27/facebook-release-report-on-government-requests/?ref=technology&_r=0

Translations of the Adopted Documents By Article 29 WP

We have published on the website the translations of the following adopted documents:
WP 197:
"Opinion 06/2012 on the draft Commission Decision on the measures applicable to the notification of personal data breaches under Directive 2002/58/EC on privacy and electronic communications"

http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/index_en.htm#h2-2

WP 204:
"Explanatory Document on the Processor Binding Corporate Rules"

WP 205:
"Opinion 04/2013 on the Data Protection Impact Assessment Template for Smart Grid and Smart Metering Systems (?DPIA Template?) prepared by Expert Group 2 of the Commission?s Smart Grid Task Force"

and

WP 206:
"Opinion 05/2013 on Smart Borders"

http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/index_en.htm#h2-1

The Next Generation Communications Privacy Act


Abstract:     
In 1986, Congress enacted the Electronic Communications Privacy Act (ECPA) to regulate government access to Internet communications and records. ECPA is widely seen as outdated, and ECPA reform is now on the Congressional agenda. At the same time, existing reform proposals retain the structure of the 1986 Act and merely tinker with a few small aspects of the statute. This Article offers a thought experiment about what might happen if Congress repealed ECPA and enacted a new privacy statute to replace it.

The new statute would look quite different from ECPA because overlooked changes in Internet technology have dramatically altered the assumptions on which the 1986 Act was based. ECPA was designed for a network world with high storage costs and only local network access. Its design reflects the privacy threats of such a network, including high privacy protection for real-time wiretapping, little protection for non-content records, and no attention to particularity or jurisdiction. Today’s Internet reverses all of these assumptions. Storage costs have plummeted, leading to a reality of almost total storage. Even United States-based services now serve a predominantly foreign customer base. A new statute would need to account for these changes.

The Article contends that a next generation privacy act should contain four features. First, it should impose the same requirement on access to all contents. Second, it should impose particularity requirements on the scope of disclosed metadata. Third, it should impose minimization rules on all accessed content. And fourth, it should impose a two-part territoriality regime with a mandatory rule structure for United States-based users and a permissive regime for users located abroad.


By Orin S. Kerr
Source and read the full article:
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2302891

Ganitha

ganitha is a collection of algorithms we have written, driven by a need to perform at-scale machine learning and statistical analysis on Hadoop. The reasons we picked Scalding and Hadoop would be obvious to the most passionate of developers in the Scala and Hadoop ecosystems…

Source and more details:
http://tresata.com/announcing-our-1st-open-source-project-ganitha/

Threats to Mobile Devices Using the Android Operating System

Source and read more:
http://info.publicintelligence.net/DHS-FBI-AndroidThreats.pdf

MINISTERS JOIN THE CELEBRATION OF "ANTEP BAKLAVASI"



Minister of Family and Social Policy Fatma Şahin and Minister of Development Cevdet Yılmaz joined in the celebration after the EU Commission, following its review, gave a "favorable" finding on the application filed by the Gaziantep Chamber of Industry (GSO) to register baklava as "Antep Baklavası."
Congratulating GSO Chairman Adil Konukoğlu and everyone who contributed to the registration process, Minister Şahin and Minister Yılmaz commended the people of Gaziantep for protecting their local heritage.
Minister of Family and Social Policy Fatma Şahin, noting that the groundwork establishing Antep baklava as a brand belonging to Turkey and to Gaziantep had been laid over many years, said: "As you know, in the past some of our local products were claimed by others even though they belong to us. We did not allow that to happen with baklava. Both the previous and the current chairman of the GSO did very important work on this."
Emphasizing the importance of the EU Commission's favorable finding on the registration application, Şahin said: "This is very important. It shows what a visionary outlook we have and that Gaziantep is a city that protects everything of its own. It shows our foresight and our horizon. And it shows the quality of our brand and of our baklava. May our branded baklava bring good fortune."
Offering his congratulations, Minister of Development Cevdet Yılmaz said: "I congratulate the GSO Chairman and everyone who contributed. This is an important achievement for all of Turkey."


-"WE CAME THROUGH WITH FLYING COLORS"-
In his statement, GSO Chairman Adil Konukoğlu said that all the people of Gaziantep believed that "baklava belongs to Gaziantep and should be given a geographical indication."
Recalling that the GSO obtained the registration of "Antep baklavası" as a geographical indication from the Turkish Patent Institute in 2007 and applied to the EU for registration on July 10, 2009, Konukoğlu reported that the application was found favorable following the EU Commission's review and that the official application document for Antep baklava was published in the Official Journal of the EU on August 8, 2013.
Noting that a three-month objection period now follows, Konukoğlu said:
"Once the objection period is complete, we will, God willing, finalize our registration certificate. We went through incredible difficulties over these four years. We attended meetings in the EU many times; our colleagues traveled there again and again. In the end we brought along representatives of our baklava makers, and they explained how baklava is made. The review panels there examined every issue down to the finest detail. There were objections from various countries. But in the end we came through with flying colors. Even if objections arrive during the three-month period, we have a strong sense that the outcome will be in our favor. May this bring good fortune to Turkey and to Gaziantep. The registration of Antep baklava is the first geographical indication Turkey has received from the EU. Just as our industrialists and our city have always been the first to achieve things, here too we have put our signature on a first."

Source:
http://www.gso.org.tr/?gsoHaberID=2849

GB3D Type Fossils Online

The International Code of Zoological Nomenclature and the International Code of Nomenclature for algae, fungi and plants require that every species or subspecies of organism, whether living or fossil, should have a type or reference specimen to define its characteristic features.
The GB3D Type Fossils Online project, funded by JISC, aims to develop a single database of the type specimens, held in British collections, of macrofossil species and subspecies found in the UK, including links to photographs (including 'anaglyph' stereo pairs) and a selection of 3D digital models.
The database currently contains only a selection of specimens and is still being populated.

Source and more:
http://www.3d-fossils.ac.uk/home.html

Beware of the giraffes in your data

Marketers and analysts are always on the lookout for exciting new insights which can translate into action items and provide strategic advantage, but they often miss them. They can even make the wrong decisions – because they fail to account for the “giraffe effect” in their data.
Giraffes are what I call portions of data which dominate the rest of the data – and hide important insights. Sometimes they even lead to wrong conclusions. For example, a gaming company client looking for its highest-value customers thought the data said it should market to men, when in fact women spent twice as much as those with a Y chromosome. How could the data lie?
The truth is, it didn’t. The company was just distracted by a giraffe.

The giraffe, the fox, the cat and the mouse

Let’s say you’re out watching animals in a nature reserve. Undoubtedly, when you spot a majestic giraffe in your binoculars, you’re going to take a good look at him. Meanwhile, many of the other, smaller animals will all just seem, well, small. You won’t notice that there are significant differences in height among the smaller animals, especially as compared to the giraffe.
However, if you can take your eyes off the giraffe for a minute and zoom your binoculars into the smaller animals on the plain, an amazing thing happens: you become aware that the differences in size between the animals are actually much larger than you had first realized.
This is a very simple example of the giraffe effect. When people look at a set of data which includes some very large, dominant members, important differences among the other data in the set often disappear from view.
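The practical fix is to segment before you aggregate, so one dominant member cannot drown out the rest. A toy Python sketch of the gaming-company example from the introduction; all figures are invented:

    from statistics import mean

    # Invented purchase data: one male "whale" dominates the aggregate.
    purchases = [
        ("M", 10_000),                       # the giraffe
        ("M", 20), ("M", 25), ("M", 15),
        ("F", 50), ("F", 40), ("F", 45),
    ]

    overall = {g: mean(v for s, v in purchases if s == g) for g in ("M", "F")}
    print("with the giraffe:", overall)      # men look far more valuable

    typical = [(s, v) for s, v in purchases if v < 1_000]  # set the whale aside
    by_gender = {g: mean(v for s, v in typical if s == g) for g in ("M", "F")}
    print("typical customers:", by_gender)   # women spend roughly twice as much

Looked at in aggregate, the single whale makes men appear far more valuable; set him aside and the typical woman spends about twice what the typical man does.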

By Pini Yakuel
Source and more:
http://gigaom.com/2013/08/24/beware-of-the-giraffes-in-your-data/


Where Teens Seek Online Privacy Advice

Overview


Many teens ages 12-17 report that they usually figure out how to manage content sharing and privacy settings on their own. Focus group interviews with teens suggest that for their day-to-day privacy management, teens are guided through their choices in the app or platform when they sign up, or find answers through their own searching and use of their preferred platform.

At the same time, though, a nationally representative survey of teen internet users shows that, at some point, 70% of them have sought advice from someone else about how to manage their privacy online. When they do seek outside help, teens most often turn to friends, parents or other close family members.


About the Survey


These findings are based on a nationally representative phone survey of 802 parents and their 802 teens ages 12-17. It was conducted between July 26 and September 30, 2012. Interviews were conducted in English and Spanish and on landline and cell phones. The margin of error for the full sample is ± 4.5 percentage points. In collaboration with the Berkman Center for Internet & Society at Harvard, this report also includes insights and quotes gathered through a series of in-person focus group interviews about privacy and digital media, with a focus on social networking sites (in particular Facebook), conducted by the Berkman Center’s Youth and Media Project between February and April 2013. The team conducted 24 focus group interviews with a total of 156 participants across the greater Boston area, Los Angeles, Santa Barbara (California), and Greensboro (North Carolina).
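For context, the quoted margin of error follows from the sample size. A quick Python check of the simple-random-sample formula; the published ±4.5 points is somewhat larger than this raw figure, presumably because it also reflects the survey's weighting and design effects:

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """95% margin of error for a simple random sample proportion."""
        return z * math.sqrt(p * (1 - p) / n)

    print(f"{margin_of_error(802):.1%}")  # ~3.5% for n = 802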

Source and read the full report:
http://www.pewinternet.org/Reports/2013/Where-Teens-Seek-Privacy-Advice.aspx


 

Brute force searching, the typical set and Guesswork


Abstract
 
Consider the situation where a word is chosen probabilistically from a finite list. If an attacker knows the list and can inquire about each word in turn, then selecting the word via the uniform distribution maximizes the attacker’s difficulty, its Guesswork, in identifying the chosen word. It is tempting to use this property in cryptanalysis of computationally secure ciphers by assuming coded words are drawn from a source’s typical set and so, for all intents and purposes, uniformly distributed within it. By applying recent results on Guesswork for i.i.d. sources, it is this equipartition ansatz that we investigate here. In particular, we demonstrate that the expected Guesswork for a source conditioned to create words in the typical set grows, with word length, at a lower exponential rate than that of the uniform approximation, suggesting use of the approximation is ill-advised.
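For a word drawn from a list of N words with probabilities sorted so that p_(1) >= ... >= p_(N), an optimal attacker guesses in decreasing-probability order, so the expected Guesswork is E[G] = sum over i of i * p_(i), which the uniform distribution maximizes at (N+1)/2. A small Python sketch of this baseline property; the skewed distribution is an arbitrary example:

    def expected_guesswork(probs):
        """E[G] when guessing words in decreasing order of probability."""
        ordered = sorted(probs, reverse=True)
        return sum(i * p for i, p in enumerate(ordered, start=1))

    n = 8
    uniform = [1.0 / n] * n
    skewed = [2.0 ** -i for i in range(1, n + 1)]  # arbitrary non-uniform example
    total = sum(skewed)
    skewed = [p / total for p in skewed]

    print(f"uniform: {expected_guesswork(uniform):.2f}")  # (n+1)/2 = 4.50
    print(f"skewed:  {expected_guesswork(skewed):.2f}")   # ~1.97, much easier to guess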

Authors: Mark M. Christiansen, Ken R. Duffy, Flavio du Pin Calmon and Muriel Medard

Source and read the full article:
http://arxiv.org/pdf/1301.6356.pdf



 

The Banality of Security

The Curious Case of Surveillance Cameras

Abstract

Why do certain security goods become banal (while others do not)? Under what conditions does banality occur and with what effects? In this paper, we answer these questions by examining the story of closed circuit television cameras (CCTV) in Britain. We consider the lessons to be learned from CCTV’s rapid—but puzzling—transformation from novelty to ubiquity, and what the banal properties of CCTV tell us about the social meanings of surveillance and security. We begin by revisiting and reinterpreting the historical process through which camera surveillance has diffused across the British landscape, focusing on the key developments that encoded CCTV in certain dominant meanings (around its effectiveness, for example) and pulled the cultural rug out from under alternative or oppositional discourses. Drawing upon interviews with those who produce and consume CCTV, we tease out and discuss the family of meanings that can lead one justifiably to describe CCTV as a banal good. We then examine some frontiers of this process and consider whether novel forms of camera surveillance (such as domestic CCTV systems) may press up against the limits of banality in ways that risk unsettling security practices whose social value and utility have come to be taken for granted. In conclusion, we reflect on some wider implications of banal security and its limits.

Authors: Benjamin Goold, Ian Loader and Angélica Thumala

Source and read the full paper:
http://bjc.oxfordjournals.org/content/early/2013/07/22/bjc.azt044.abstract



WebCrypto Key Discovery Working Draft Published

The Web Cryptography Working Group has published a Working Draft of WebCrypto Key Discovery. This specification describes a JavaScript API for discovering named, origin-specific, pre-provisioned cryptographic keys for use with the Web Cryptography API. Pre-provisioned keys are keys which have been made available to the user agent by means other than the generation, derivation, or importation functions of the Web Cryptography API. Origin-specific keys are keys that are available only to a specified origin. Named keys are identified by a name assumed to be known to the origin in question and provisioned with the key itself. Learn more about the Security Activity.

Source:
http://www.w3.org/

Open Government Partnership Initiative (Açık Yönetim Ortaklığı Girişimi)

Friday, August 23, 2013
Official Gazette (Resmî Gazete)
Issue: 28744
CIRCULAR
From the Prime Ministry:
Subject: Open Government Partnership Initiative
CIRCULAR
2013/9
Thanks to the policies implemented in our country in recent years, a significant transformation is under way in almost every field, above all in education, health, transportation, security, development, the environment, urban planning and decentralization. In this process, many legal, institutional and structural reforms have been put into effect, and considerable progress has been made in expanding the individual rights and freedoms of all our citizens, in raising our country's standards of democracy, human rights and the rule of law, and in increasing effectiveness, transparency, accountability and participation in public administration.
As an indication of the developments in these areas and of its political resolve, our country has joined the international initiative known as the Open Government Partnership, adopted to date by roughly 60 countries. Within the scope of this initiative, member countries have prepared and begun implementing national action plans in light of its core principles: increasing transparency and accountability and fighting corruption effectively; ensuring greater participation by citizens and civil society in public decision-making and implementation; making greater use of technological means for an open and effective public administration; and strengthening the position of citizens vis-à-vis the state.
Within the framework of the Open Government Partnership Initiative, to which our Government attaches great importance, the Chair of the "Commission for Increasing Transparency and Strengthening the Fight against Corruption in Turkey," established by Prime Ministry Circular No. 2009/19, has been authorized to determine all plans, programs, strategies and policies currently being carried out on behalf of our country and those to be prepared in the period ahead; to monitor and oversee their implementation and make the changes required by current developments; to ensure, as far as possible, the participation of civil society organizations and private-sector representatives in these processes; to represent our country in national and international forums; and to coordinate the work to be carried out. The Prime Ministry Inspection Board will provide administrative and technical support for all work conducted in this scope and will monitor public institutions' and agencies' implementation of the decisions taken.
Accordingly, in order to ensure that the work carried out under the Open Government Partnership Initiative and the instructions issued in this regard are fulfilled effectively and efficiently, I kindly request that all public institutions and agencies take every necessary measure and provide all needed support and assistance.
 
 
                                                                                                                                Recep Tayyip ERDOĞAN
                                                                                                                                              Prime Minister