ANALYTIC METHODS FOR OPTIMIZING REALTIME CROWDSOURCING


ABSTRACT

Realtime crowdsourcing research has demonstrated that it is possible to recruit paid crowds within seconds by managing a small, fast-reacting worker pool. Realtime crowds enable crowd-powered systems that respond at interactive speeds: for example, cameras, robots and instant opinion polls. So far, these techniques have mainly been proof-of-concept prototypes: research has not yet attempted to understand how they might work at large scale or optimize their cost/performance trade-offs. In this paper, we use queueing theory to analyze the retainer model for realtime crowdsourcing, in particular its expected wait time and cost to requesters. We provide an algorithm that allows requesters to minimize their cost subject to performance requirements. We then propose and analyze three techniques to improve performance: push notifications, shared retainer pools, and precruitment, which involves recalling retainer workers before a task actually arrives. An experimental validation finds that precruited workers begin a task 500 milliseconds after it is posted, delivering results below the one-second cognitive threshold for an end-user to stay in flow.
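The abstract does not spell out the authors' model, but the kind of quantity it analyzes can be illustrated with a textbook M/M/c queue: treat the retainer pool as c workers serving Poisson task arrivals and compute the Erlang-C probability that a task must wait, and the resulting expected wait. This is a generic sketch, not the paper's actual retainer analysis; the arrival rate, service rate and pool size below are made-up parameters.

```typescript
// Generic M/M/c (Erlang-C) sketch: expected wait for tasks arriving at rate
// `lambda` (tasks/sec) served by `c` workers with service rate `mu` (tasks/sec
// each). Illustrative only; not the paper's retainer model.

function erlangC(c: number, lambda: number, mu: number): number {
  const a = lambda / mu;                 // offered load (Erlangs)
  if (a >= c) return 1;                  // unstable: every task waits
  let sum = 0;
  let term = 1;                          // a^k / k!, starting at k = 0
  for (let k = 0; k < c; k++) {
    sum += term;
    term *= a / (k + 1);
  }
  const tail = term * (c / (c - a));     // (a^c / c!) * c / (c - a)
  return tail / (sum + tail);            // probability an arriving task waits
}

function expectedWaitSeconds(c: number, lambda: number, mu: number): number {
  return erlangC(c, lambda, mu) / (c * mu - lambda);
}

// Hypothetical numbers: 0.5 tasks/sec arriving, each task takes ~20 s,
// pool of 15 retainer workers.
console.log(expectedWaitSeconds(15, 0.5, 1 / 20).toFixed(2), "seconds");
```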
Authors:
Michael S. Bernstein, David R. Karger, Robert C. Miller and Joel Brandt

Source:
http://people.csail.mit.edu/msbernst/papers/realtimemodel-ci2012.pdf

Kernels for Vector-Valued Functions: a Review

Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory, as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a Bayesian/generative perspective they are key in the context of Gaussian processes, where the kernel function is also known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs, and indeed there has been a considerable amount of work devoted to designing and learning kernels. More recently there has been an increasing interest in methods that deal with multiple outputs, motivated partly by frameworks like multitask learning. In this paper, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.
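One family of constructions covered in reviews of this kind is the separable (intrinsic coregionalization) kernel, in which a scalar input kernel k(x, x') is combined with a positive semi-definite coregionalization matrix B so that K((x, d), (x', d')) = B[d][d'] · k(x, x'). The sketch below builds such a kernel matrix for a toy two-output problem; the RBF base kernel, the lengthscale and the B matrix are illustrative choices on my part, not anything prescribed by the paper.

```typescript
// Separable multi-output (intrinsic coregionalization) kernel sketch:
// K((x, d), (x', d')) = B[d][d'] * k(x, x'), with k an RBF base kernel.
// Toy example with 2 outputs; all parameter values are made up.

function rbf(x: number, y: number, lengthscale = 1): number {
  const d = x - y;
  return Math.exp(-(d * d) / (2 * lengthscale * lengthscale));
}

// Coregionalization matrix B (must be positive semi-definite); here B = w w^T + diag(v).
const w = [1.0, 0.6];
const v = [0.1, 0.2];
const B = [
  [w[0] * w[0] + v[0], w[0] * w[1]],
  [w[1] * w[0],        w[1] * w[1] + v[1]],
];

// Full kernel matrix over all (input, output) pairs: size (N * D) x (N * D).
function multiOutputKernel(xs: number[], numOutputs: number): number[][] {
  const n = xs.length * numOutputs;
  const K = Array.from({ length: n }, () => new Array<number>(n).fill(0));
  for (let i = 0; i < xs.length; i++) {
    for (let j = 0; j < xs.length; j++) {
      const kxx = rbf(xs[i], xs[j]);
      for (let d1 = 0; d1 < numOutputs; d1++) {
        for (let d2 = 0; d2 < numOutputs; d2++) {
          K[i * numOutputs + d1][j * numOutputs + d2] = B[d1][d2] * kxx;
        }
      }
    }
  }
  return K;
}

console.log(multiOutputKernel([0, 0.5, 1.0], 2)); // 6 x 6 covariance matrix
```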

Authors:
Mauricio A. Alvarez, Lorenzo Rosasco, Neil D. Lawrence

Source:
http://arxiv.org/abs/1106.6251

10th XBRL Europe Day



13 December 2012: 10th XBRL Europe Day, Frankfurt, Germany

The agenda for the day, which will be finalized in the coming days and posted on our websites, is as follows:

  • 13th of December - Morning, 09:00 - 13:00: Plenary session
  • 13th of December - Afternoon, 14:00 - COB:
o Technical Working Groups (chaired by Derek de Brandt, Thomas Verdin, Hans Buysse)
o Other working groups - Euromarcom, chaired by Poul Kjaer, followed by the Strategy and EU Liaison Working Committee
Let us also remind you that, on the day before:
  • 12th of December, afternoon: the XBRL Europe EU Business Register Working Group, chaired by Thomas Verdin, will hold a dedicated working session
  • 12th of December, end of the afternoon: an Executive Committee meeting (restricted), chaired by Marc Hemmerling, will take place
 
Source:

Knowing When to Fold 'Em

Caltech engineers and an origami expert are joining forces to build a retinal implant to treat blindness
 
Electrical engineer Azita Emami is an expert in the 21st century technology of analog and digital circuits for computers, sensors, and other applications, so when she came to Caltech in 2007, she never imagined that she would be incorporating in her research an art form that originated centuries ago. But origami—the Japanese art of paper folding—could play a critical role in her project to design an artificial retina, which may one day help thousands of blind and visually impaired people regain their vision.
Retinal implants are designed to bypass the photoreceptors in the retina that have been damaged by diseases such as retinitis pigmentosa (RP) and age-related macular degeneration (AMD). About four years before Emami arrived at the Institute, Caltech investigators began working on a retinal implant through USC's Biomimetic Microelectronic Systems–Engineering Research Center, funded by the National Science Foundation (NSF). The basic idea is to use a miniature camera mounted on a pair of eyeglasses to capture images, then process the images and send the digital information wirelessly to an implantable microchip. The microchip generates electrical currents for stimulation, and a tiny cable carries the currents to an electrode array attached to the patient's retina. The electrodes stimulate cells in the eye, which transmit signals through the optic nerve to the part of the brain that creates a picture.
The center's director, Mark Humayun, an ophthalmologist at USC's Doheny Eye Institute and a pioneer in artificial-retina surgery, has implanted such a device in several completely blind patients suffering from end-stage RP, restoring some of their vision. The 60-electrode array allows these patients to see light as well as low-resolution representations of objects and enlarged letters.
Hundreds of thousands of people who suffer from AMD, however, are able to see at least that much on their own and thus would derive no benefit from the array. To create an artificial retina that could help these people, Humayun needed a better chip and an array that had more electrodes to stimulate more cells in the eye. At the suggestion of Caltech professor of electrical engineering and mechanical engineering Yu-Chong Tai, who had worked with Humayun on packaging and integration of the retinal implants, Emami, an assistant professor of electrical engineering and an expert in building ultralow-power circuits, joined the team to focus on the next generation retinal implants.
Emami's lab recently developed just such a chip, which supports 512 electrodes and is extendable to 1024 electrodes if two chips are used. The chip has wireless capabilities for power and data telemetry and can fit inside the eyeball, eliminating the need for the infection-prone cable used in the earlier system. The design also features many novel techniques for reducing size and power. Reducing the power consumption is critical for wireless power delivery and to avoid tissue damage due to the heat generated by the chip. Humayun will soon test the chip on subjects to see exactly how much of their vision is restored.
But even that electrode-rich array won't solve two of the biggest challenges of the technology: creating a device that requires only a minimally invasive incision to implant, and one that also conforms to the shape of the eye. The original electrode array was mounted on a relatively flat substrate that required a large surgical incision for implantation. It could only be tacked onto one spot on the retina to avoid damaging the neurons—which meant that it pulled away at the loose end. And that also meant that some of the electrodes would be completely ineffective while others needed a greater current from the chip to properly stimulate retinal cells, leading to high power consumption.
Emami, Humayun, and Tai realized that a flexible substrate that could be folded up, origami-style, before implantation and then opened up to a curved shape once inside would need only a minimally invasive incision to be slid into place. Instead of one large chip, many smaller chips distributed over the substrate and between the folds would remove the need for the cable and lead to better reliability and lower cost, Emami says. With a system that conformed to the curve of the eye, the location of the chips and the electrodes could be optimized through the design of the origami structure, precisely matching the parts of the eye to be stimulated.
To create such a design, Emami recruited Caltech alum Robert Lang (BS '82, PhD '86), one of the world's leading origami experts. Lang, who has practiced origami for more than 40 years, is known for developing mathematical equations to enable the construction of highly complex origami designs. Over the summer, Emami received an NSF grant to build the first prototype of an origami implant that will fit inside the eye and match the contour of the retina.
"I'm used to working with paper that starts out as no smaller than two inches square," Lang says. This new creation, however, will be less than one quarter that size, will be made out of plastic, and will have to deploy perfectly after surgical implantation.
Assisting Lang in the design is Sergio Pellegrino, the Joyce and Kent Kresa Professor of Aeronautics and professor of civil engineering and a senior research scientist at JPL. Pellegrino is an expert at developing origami-like structures, but on a giant scale: he devises lightweight expandable structures for use on spacecraft—such as foldable booms that serve as antennas, and deployable masts.
The ability to translate these sorts of very large designs to something that can be unobtrusively inserted and then unfolded in the eye "is a matter of scaling, and that's an engineering principle. It is what engineers do," says Ares Rosakis, chair of the division of engineering and applied science. "The difference is that at Caltech we also invent and scale our own inventions: we invent something for X and we use it for Y. So someone like Pellegrino can invent something for space and then have fantastic successes by scaling it for use in medical engineering."
While Pellegrino and Lang work on the origami, Emami will continue working on the chip. By the end of next year they hope to show in animal models that an origami substrate can be inserted inside the eye, unfolded, and held in place by either retinal tacks or a less invasive method, also using origami. Soon after, they hope to have a foldable artificial retina that can be tested on a patient.
Once perfected, Emami thinks that the new retinal implant technology could be applied to other medical applications, such as neural implants that are being developed to help paralyzed people regain movement. "Our origami approach is fundamentally different and can lead to a new area in engineering with a great impact for neuroscience and biomedical devices," Emami says. "We may be able to benefit many people."

Source:
http://www.caltech.edu/content/knowing-when-fold-em

 

NIST Publishes Methods to Manage Risk in the Federal ICT Supply Chain

The National Institute of Standards and Technology (NIST) has published the final version of Notional Supply Chain Risk Management Practices for Federal Information Systems. This guide offers an array of supply chain assurance methods to help federal agencies manage the risks associated with purchasing and implementing information and communications technologies (ICT) products and services.
Security risks introduced via the supply chain—both intentional and unintentional—are substantial and on the rise. The global ICT supply chain's growing sophistication and increasing speed and scale leave government agencies vulnerable to exploitation through a variety of means, including counterfeit materials, malicious software or untrustworthy products.
The guide describes ICT supply chain risk management as a multidisciplinary practice with a number of interconnected enterprise processes that, when performed correctly, will help departments and agencies manage the risk of using ICT products and services. The publication calls for procurement organizations to establish a coordinated team approach to assess the ICT supply chain risk and to manage this risk by using technical and programmatic mitigation techniques.
The new guide is based on information technology security practices and procedures published by NIST, the National Defense University, the National Defense Industrial Association and others. These practices were expanded to include supply chain implications. This version of Notional Supply Chain Risk Management Practices for Federal Information Systems has been through two public review periods, allowing for input from a broad array of stakeholders. The final publication differs from previous drafts in that it provides a more specific definition of the supply chain threat and further details on the roles of integrator and supplier and how they apply to the federal government's acquisition of commercial off-the-shelf products.
NIST is developing a draft Special Publication based on the proceedings of the Oct. 15-16, 2012, Supply Chain Risk Management Workshop and ongoing discussions with industry, academic and government stakeholders. PowerPoint presentations from that workshop are available at http://www.nist.gov/itl/csd/scrm_2012workshop.cfm. NIST will continue to engage public- and private-sector stakeholders throughout the publication development process.

Notional Supply Chain Risk Management Practices for Federal Information Systems (NIST IR 7622) is available at http://nvlpubs.nist.gov/nistpubs/ir/2012/NIST.IR.7622.pdf.

Source:
http://www.nist.gov/itl/csd/supply-112712.cfm

IAB Releases Site Tagging Best Practices for Public Comment, to Alleviate Industry-Wide Complexities

Like many essential tools in the interactive advertising industry, tagging has matured and evolved into a complex ecosystem that provides publishers and advertisers with massive benefits and insights – but also poses a series of technical and operational challenges. “Site Tagging Best Practices,” released today for public comment by the Interactive Advertising Bureau (IAB), meets these issues head on. Developed by the IAB’s Data Council, through its Site Tagging Best Practices Task Force, the best practices offer detailed recommendations for site tagging that establish healthy workflow procedures, safeguard page performance and data control, and address privacy compliance and consumer security concerns.
“Tagging is a fundamental element of the heavily data-driven interactive advertising ecosystem without which it would not continue to thrive,” said Steve Sullivan, Vice President, Advertising Technology, IAB. “To better meet the needs of publishers, advertisers, marketers and consumers, we must not only take account of the ongoing value of site tagging, but more fully understand the operational challenges presented by site tagging’s increasingly prolific use.”
“Site Tagging Best Practices” identifies areas of potential value loss for site owners, including:
  • User abandonment
  • Negative customer experience
  • Performance impact
  • Operational strain
  • Unintended transfer of data
  • Privacy issues
To help mitigate these risks, the best practices aim to educate and equip the industry with a new reference for addressing these common challenges in site tagging as they relate to site performance and data management. The document covers:
  • Workflow planning, implementation and maintenance
  • Performance
  • Data capture and transfer
  • Privacy
In addition, the IAB “Site Tagging Best Practices” contains a lexicon of marketplace terms to help stakeholders better understand and navigate the vernacular of the current tagging ecosystem.
“With site tagging laying the foundation for selling ad space and tracking performance across the interactive landscape, these best practices will set a solid framework for moving the industry forward,” said Mitchell Weinstein, Senior Vice President, Director of Ad Operations, UM. “The value that tagging brings to marketers is enormous, but it can only reach its full potential with practices like this in place to address critical concerns and allow for effective tagging to flourish.”
“Without question, site tagging has helped to revolutionize digital advertising from a system of basic ad serving to an extremely sophisticated and complex marketing system – one that is of immense importance to our industry,” said Todd Chu, Senior Vice President, Partner Development, BrightTag, and Co-chair of the IAB Site Tagging Task Force. “By adhering to fundamental best practices around data capture and transfer, all digital marketing stakeholders will benefit along with the customers interacting with their sites.”
“These best practices provide incredible value to a variety of stakeholders in the industry, beyond online publishers and advertisers,” said Maggie Neuwald, Vice President, Solutions Marketing, TagMan, and Co-chair of the IAB Site Tagging Task Force. “Its impact will be felt by brand marketers, analytics leaders, IT professionals, legal and policy executives and agency heads. In addition, beyond site tagging policy and operations, this report will also serve as a go-to checklist for technology practitioners in product and engineering.”
IAB member companies which contributed to “Site Tagging Best Practices” include AMC Networks, BlueKai, BrightTag, Catchpoint, Evidon, Google & YouTube, Krux Digital, Microsoft-Atlas Advertiser, Mirror Image Internet, TagMan and Yahoo!.
The public comment period is open until January 4, 2013. Please submit any feedback to Jessica Anderson, lead on Site Tagging for IAB, at jessica.anderson@iab.net prior to the deadline.
For a full copy of “Site Tagging Best Practices,” please visit www.iab.net/sitetagging.
About the IAB
The Interactive Advertising Bureau (IAB) is comprised of more than 500 leading media and technology companies that are responsible for selling 86% of online advertising in the United States. On behalf of its members, the IAB is dedicated to the growth of the interactive advertising marketplace, of interactive’s share of total marketing spend, and of its members’ share of total marketing spend. The IAB educates marketers, agencies, media companies and the wider business community about the value of interactive advertising. Working with its member companies, the IAB evaluates and recommends standards and practices and fields critical research on interactive advertising. Founded in 1996, the IAB is headquartered in New York City with a Public Policy office in Washington, D.C. For more information, please visit www.iab.net.
 
Source:

Wide Open Privacy: Strategies For The Digital Life





Wide Open Privacy is a guide to protecting your digital identity and personal brand in the digital age. It presents a new philosophy that leverages the power of the Internet for maximum benefit while minimizing your exposure to privacy infractions.

November 2, 2012
 
Source:

The Network Information API

W3C Working Draft 29 November 2012

This version:
http://www.w3.org/TR/2012/WD-netinfo-api-20121129/
Latest published version:
http://www.w3.org/TR/netinfo-api/
Latest editor's draft:
http://dvcs.w3.org/hg/dap/raw-file/tip/network-api/Overview.html
Previous version:
http://www.w3.org/TR/2011/WD-netinfo-api-20110607/
Editor:
Mounir Lamouri, Mozilla

Abstract

The Network Information API provides an interface for web applications to access the underlying connection information of the device.
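The draft exposes this information through a connection object on navigator; the sketch below shows how a page might read it, assuming the attributes defined in the 29 November 2012 working draft (an estimated bandwidth figure and a metered flag) plus a change event. Attribute names and semantics here follow my reading of that draft and may differ in later revisions or in actual browser implementations, so treat this as an illustration rather than a reference.

```typescript
// Sketch of consuming the Network Information API as drafted in 2012.
// Assumes navigator.connection exposes `bandwidth` (estimated MB/s; 0 when
// offline, Infinity when unknown) and `metered` (boolean), and fires a
// "change" event when either value changes. Draft API: subject to change.

interface DraftNetworkConnection extends EventTarget {
  readonly bandwidth: number;
  readonly metered: boolean;
}

const connection = (navigator as Navigator & {
  connection?: DraftNetworkConnection;
}).connection;

function describeConnection(c: DraftNetworkConnection): string {
  if (c.bandwidth === 0) return "offline";
  const speed = Number.isFinite(c.bandwidth)
    ? `~${c.bandwidth} MB/s`
    : "unknown speed";
  return `${speed}${c.metered ? " (metered: avoid large downloads)" : ""}`;
}

if (connection) {
  console.log("Connection:", describeConnection(connection));
  // React to connectivity changes, e.g. to pause prefetching on metered links.
  connection.addEventListener("change", () =>
    console.log("Connection changed:", describeConnection(connection))
  );
} else {
  console.log("Network Information API not supported in this browser.");
}
```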

Status of This Document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at http://www.w3.org/TR/.
The functionality described in this specification was initially specified as part of the System Information API but has been extracted in order to be more readily available, more straightforward to implement, and in order to produce a specification that could be implemented on its own merits without interference with other, often unrelated, features.
This document was published by the Device APIs and Policy Working Group as a Working Draft. This document is intended to become a W3C Recommendation. If you wish to make comments regarding this document, please send them to public-device-apis@w3.org (subscribe, archives). All feedback is welcome.
Publication as a Working Draft does not imply endorsement by the W3C Membership. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.
This document was produced by a group operating under the 5 February 2004 W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

Source:

Medicare Is Faulted on Shift to Electronic Records

The use of electronic medical records has been central to the aim of overhauling health care in America. Advocates contend that electronic records systems will improve patient care and lower costs through better coordination of medical services. The privacy protection of those records has been a primary concern.
But as Reed Abelson of The New York Times reports, the conversion to electronic medical records is “vulnerable” to fraud and abuse because of the failure of Medicare officials to develop appropriate safeguards, according to a sharply critical report to be issued on Thursday by federal investigators.
The report says Medicare, which is charged with managing the incentive program that encourages the adoption of electronic records, has failed to put in place adequate safeguards to ensure that information being provided by hospitals and doctors about their electronic records systems is accurate. To qualify for the incentive payments, doctors and hospitals must demonstrate that the systems lead to better patient care, meeting a so-called meaningful use standard by, for example, checking for harmful drug interactions.
Although there is little disagreement over the potential benefits of electronic records in reducing duplicative tests and avoiding medical errors, critics increasingly argue that the federal government has not devoted enough time or resources to making certain the money it is investing is being well spent.

Source:
http://bits.blogs.nytimes.com/2012/11/29/daily-report-medicare-is-faulted-on-shift-to-electronic-records/
 

Senate Committee Approves Bill Requiring Warrants for E-Mail

A Senate committee on Thursday unanimously backed sweeping digital privacy protections requiring the government, for the first time, to get a probable-cause warrant to obtain e-mail and other content stored in the cloud.

The measure, sponsored by Sen. Patrick Leahy (D-Vermont), the head of the Senate Judiciary Committee, amends the 1986 Electronic Communications Privacy Act. The amendment would nullify a provision that allows the government to acquire a suspect’s e-mail or other stored content from an internet service provider without showing probable cause that a crime was committed.

The development comes as e-mail privacy is again in the spotlight after FBI investigators uncovered an affair between then-CIA chief David Petraeus and his biographer Paula Broadwell after gaining access to e-mail accounts used by Broadwell.

Currently, the government can obtain e-mail without a warrant as long as the content has been stored on a third-party server for 180 days or more, and only needs to show, often via an administrative subpoena, that it has “reasonable grounds to believe” the information would be useful in an investigation.

The measure and its minor amendments still face tough fights before the full Senate and in the House, and it likely will be revised to comport with concerns from the Justice Department and other lawmakers that the bill is soft on crime. The measure is not expected to reach the Senate floor until sometime next year.

“There will still be work to be done on this,” Leahy said when the 50-minute Judiciary Committee hearing ended Thursday.

Leahy noted that the measure leaves intact “federal counterterrorism” provisions such as the Patriot Act, which gives the FBI power to acquire phone, banking and other records using a so-called “national security letter” without court warrants.

Sen. Chuck Grassley (R-Iowa) agreed that the ECPA amendment likely will be revised. “I think it’s a start of an important discussion,” he said.

When enacted, ECPA provided privacy to users, but that protection eroded as technology advanced and people began storing e-mail on servers for longer periods, sometimes indefinitely. The act was adopted at a time when e-mail wasn’t stored on servers for long, but instead was held briefly on its way to the recipient’s inbox. E-mail more than six months old was assumed abandoned.

Despite the reform’s uncertain future, the American Civil Liberties Union applauded the committee’s Thursday action.

“This is an important gain for privacy. We are very happy that the committee voted that all electronic content like emails, photos and other communications held by companies like Google and Facebook should be protected with a search warrant,” said Chris Calabrese, the ACLU’s legislative counsel. “We believe law enforcement should use the same standard to search your inbox that they do to search your home.”

In a bid to get his measure through, Leahy attached his privacy measure to a package that amends the Video Privacy Protection Act and which has broad support from lawmakers.

The video act outlaws the disclosure of video rentals unless the consumer gives consent, on a rental-by-rental basis. Congress adopted the measure in 1988 after failed Supreme Court nominee Robert Bork’s video rental history was published by the Washington City Paper during confirmation hearings.

But the result of the act is that it prohibits Netflix customers from allowing their Facebook streams to automatically update with information about the movies they are watching, though Spotify and other online music-streaming customers can consent to the automatic publication on Facebook of the songs they’re listening to.

The Judiciary Committee amendment allows Facebook users to have their timelines automatically updated with whatever they’re watching on Netflix, allowing consumers to opt-in for two-year periods.

“The consumer should have the ability to opt in to that,” said Sen. Dianne Feinstein (D-California).

The House passed similar Netflix legislation last year.
 
Source:

INTERNET IMPROVEMENT BOARD MEETING



The Internet Improvement Board of the Ministry of Transport, Maritime Affairs and Communication today combined its regular meeting with Safer Internet's first-anniversary events and hosted local and international ICT sector representatives at Esma Sultan Yalısı, İstanbul.


The President of the Internet Improvement Board, Serhat Özeren, and the President of the Turkish ICTA, Tayfun Acarer, made opening speeches. David Miles, Europe and Middle East Director of FOSI; John Carr of eNASCO; Dilek Atalay of Klicksafe; and Burak Özcü of Vodafone shared their experience on protecting children on the Internet. Within the Internet Improvement Board's meeting, all participants discussed several issues relating to safer internet, better regulation and internet governance.


 
 
 





Challenging Security Requirements for US Government Cloud Computing Adoption

The Federal Cloud Computing Strategy, February 8, 2011, outlines a federal cloud computing program that identifies program objectives aimed at accelerating the adoption of cloud computing across the federal government. NIST, along with other agencies, was tasked with a key role and specific activities in support of that effort, including the delivery of the NIST Cloud Computing Technology Roadmap and the publication of other Special Publications that address the reference architecture, definitions, and security aspects of cloud computing. In order to achieve adoption of cloud computing for the federal government, it is necessary to address the security and privacy concerns that federal agencies have when migrating their services to a cloud environment. To further exacerbate the situation, there are few documented details that directly address how to achieve some security aspects in a cloud environment. The purpose of this document is to provide an overview of the high-priority security and privacy challenges perceived by federal agencies as impediments to the adoption of cloud computing. The document provides descriptions of the existing mitigations to these security and privacy impediments. If no mitigations are listed, then ongoing efforts that could lead to mitigations are described. In the cases where no ongoing efforts were identified, the document makes recommendations for possible mitigation or references existing best practices.

Author: Michaela Iorga
Published: November 27, 2012
PDF version: available for download (1 MB)
Source:
http://www.nist.gov/manuscript-publication-search.cfm?pub_id=912695

Cyber security according to Winnie the Pooh

Proactive detection of security incidents II - "Honeypots"

An increasing number of complex attacks demand improved early-warning detection capabilities for CERTs. By having threat intelligence collected without any impact on production infrastructure, CERTs can better defend their constituencies' assets. Honeypots are powerful tools that can be used to achieve this goal. This document is the final report of the ‘Proactive Detection of Security Incidents: Honeypots’ study.
Nov 22, 2012
Download: ENISA_Honeypots_study.pdf (PDF document, 5,247 kB / 5,373,589 bytes)
English

Source:
http://www.enisa.europa.eu/activities/cert/support/proactive-detection-of-security-incidents-II-honeypots           

Record Online Sales Mean the End of ‘E-Commerce’

More than ever, we shop online. More than ever, we use smartphones and tablets to shop online. In a flurry of data dumps throughout the Thanksgiving-Black Friday-Cyber Monday shopping extravaganza, e-commerce analysts have used these two would-be headlines to frame the start of the online shopping season as cause for celebration among online retailers.

And they have a point: On Black Friday, e-commerce sales topped $1 billion for the first time, according to comScore — a 26 percent increase over last year. And shoppers did use mobile devices to make more of those purchases than ever before. Overall, nearly a third of all online shopping this Black Friday occurred via smartphone or tablet, up from less than one-fifth last year, according to Scot Wingo, CEO of e-commerce software maker ChannelAdvisor.

But there’s one important figure that puts a damper on the idea that this weekend’s retail results signify a revolution. The National Retail Federation says its survey results show spending over the Black Friday weekend topped $59 billion. Divided across four days, that’s nearly $15 billion per day. Even if a full $1 billion of those sales each day took place online, that’s still less than 7 percent of total spending. This is consistent with e-commerce’s share of consumer spending any other time of year. All of which leads to a less-than-revolutionary dog-bites-man headline: People still shop in stores. A lot.
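The back-of-the-envelope math in that paragraph is easy to reproduce; the snippet below simply restates it using the article's rounded figures ($59 billion over four days, $1 billion per day online), not any official dataset.

```typescript
// Rough check of the article's arithmetic using its own rounded figures.
const totalWeekendSpendUSD = 59e9;   // NRF estimate for the 4-day weekend
const days = 4;
const onlinePerDayUSD = 1e9;         // generous: ~Black Friday's online total, every day

const perDayUSD = totalWeekendSpendUSD / days;                 // ≈ $14.75B per day
const onlineShare = (onlinePerDayUSD * days) / totalWeekendSpendUSD;

console.log(`Per-day spend: $${(perDayUSD / 1e9).toFixed(2)}B`);
console.log(`Online share: ${(onlineShare * 100).toFixed(1)}%`); // ≈ 6.8%, under 7%
```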

Not that 7 percent is inconsequential. For an industry that didn’t exist 20 years ago, gobbling up a 7 percent chunk of U.S. retail sales is massive, especially since a huge percentage of those sales are going to Amazon and eBay. Those are sales that traditional retailers are losing to e-commerce; so far, their own websites and apps aren’t making up the difference.

But the fact that a huge majority of U.S. consumers still choose to go to the store should also quash any triumphalist march-of-progress rhetoric that claims an ever-growing cannibalization of traditional retail by the internet. Online spending is growing, but maybe it has an upper limit. Maybe online shopping can’t duplicate the in-store experience.

The smartest stores recognize that they can’t go all-in on either online or offline shopping, and moreover shouldn’t bother to maintain the fiction that there’s any clear line dividing the two. In an interview with WIRED last week, Walmart.com CEO Joel Anderson said that starting next year its individual stores will start to get credit for digital sales as well as sales made across its checkout counters.

This makes sense especially for Walmart, which has actively worked to blur the online/offline distinction. Online shoppers can pick up their Walmart.com orders at the store. They can go to the store and pay for their online order with cash. Increasingly, Walmart is using its 4,000 U.S. stores as de facto warehouses from which it ships items ordered online. And, the company says, about 12 percent of online orders made using its mobile app take place while the customer is physically in a Walmart store. In that light, the online/offline distinction starts to seem awfully arbitrary, much like the idea of Cyber Monday itself.

“I think the first phase of the e-commerce world was clearly about pure play — buy online, ship it to your house,” Anderson says. “I think it’s hard today to define what an e-commerce transaction is. If you buy online and pick it up in the store, is that an e-commerce transaction or an offline transaction?”

The answer, he says, is not to worry about semantics. Instead, Anderson says worry about ubiquity. “We have channels,” he says, “but strategically … we (just) need to be where our customers want us to be.”
 
Source:

Music and IT: X-Treme Audio-"eXplosive Sound"

Company Profile

X-Treme was founded in 2001 and is a brand of products created by the Strategic Business Unit of the Sound Corporation group, whose objective is to produce “concert, touring and portable sound systems”. In other words, these are professional audio systems dedicated to concerts, live open-air events or any other installation in a closed environment where live music is played.
This group’s top management directly controls well-known brands such as Peecker sound (a leader in the “fixed installations and club sound” sector for over 30 years, with more than two thousand sound reinforcement installations and an international distribution network) and XTE (which deals mainly with "commercial sound").
While maintaining complete independence in management and personnel, as well as in the strategic aims of the business unit, the X-Treme division is organized as a genuine enterprise with its own activities (from R&D to production, quality control, and so on). It can draw on both the tangible resources (the joinery, the electronics department and the electroacoustics department) and the intangible resources (the know-how of the engineers and technicians of the aforementioned companies, who make up the current staff) of a group that has been operating successfully in the competitive professional audio sector since 1968.
 
Electronics
 
XTDT Amplifiers
The new Digital Technology Series amplifiers with class D final stage are available in four models with a choice of powers and numbers of channels. The XTDT3200 and XTDT3800 models have two channels and supply power up to 1600 W and 1900 W, respectively, per channel on 4Ω. The XTDT4800F and XTDT6000F models, on the other ...
XTDP Controllers
The X-Treme XTDP24 and XTDP26 digital processors (which have 2 inputs/4 outputs and 2 inputs/6 outputs, respectively) are professional management systems for speakers and, as such, can be used for stereo configurations up to 2/3 ways or mono up to 4/6 ways. Each input has a 6 band parametric equalizer, a high-pass filter and two ...
Source and for all other details:
 

Market Analysis Regulation (Pazar Analizi Yönetmeliği)

Tuesday, 27 November 2012
Resmî Gazete (Official Gazette)
Issue: 28480
REGULATION
From the Information and Communication Technologies Authority (Bilgi Teknolojileri ve İletişim Kurumu):
MARKET ANALYSIS REGULATION
PART ONE
Purpose, Scope, Legal Basis, Definitions and Principles
Purpose and scope
ARTICLE 1 (1) The purpose of this Regulation is to lay down the procedures and principles concerning the market analyses carried out to establish, protect and promote an environment of effective competition in the electronic communications sector, and the obligations that may be imposed on operators with significant market power.
Legal basis
ARTICLE 2 (1) This Regulation has been prepared on the basis of Articles 6 and 7 of the Electronic Communications Law No. 5809 of 5/11/2008.
 
Source:
 

June 2012 Web Privacy Census

Public policymakers are proposing measures to give consumers more privacy rights online. These measures are based upon the assumption that the web privacy landscape has become worse for consumers; that their online activities are tracked more pervasively now than they were in the past. This assumption may be true, as online advertising and metrics companies have developed more sophisticated ways to track and identify individuals online. This has been substantiated in the academic literature, and in the popular press through an influential news series, “What they Know,” by Wall Street Journal reporters.
As policymakers consider different approaches for addressing internet privacy, it is critical to understand how interventions such as negative press attention, self-regulation, Federal Trade Commission enforcement actions, and direct regulation affect tracking. As early as 1995, Beth Givens of the Privacy Rights Clearinghouse suggested that federal agencies create benchmarks for online privacy. The first attempts at web measurement, discussed in our literature review, found relatively little tracking online in 1997: only 23 of the most popular websites were using cookies on their homepages. But within a few years, tracking for network advertising was present on many websites, and by 2011, all of the most popular websites employed cookies.
The Web Privacy Census is intended to formalize the benchmarking process and measure internet tracking consistently over time. We seek to explore:
  • How many entities are tracking users online?
  • What vectors (technologies) are most popular for tracking users?
  • Is there displacement (i.e. a shift from one tracking technology to another) in tracking practices?
  • Is there greater concentration of tracking companies online?
  • What entities have the greatest potential for online tracking and why?
This effort was developed and executed in partnership with Abine, Inc. Abine has been our technical collaborator and resource partner, helping us develop a reliable method for web crawling and analysis of tracking vectors.
In this report, we discuss the results of a crawl conducted on 5/17/12. We found cookies on all popular websites (by “popular websites,” we mean the top 100 most popular according to Quantcast). We conducted two different crawls: a shallow one, where our test browser just visits the homepage of a site, and a deep crawl, where our browser visits six links on a site. Our shallow crawl of the 25,000 most popular sites revealed that 87% of sites had cookies (24% first party, 76% third party), 9% had HTML5 storage objects, and less than .0001% had flash cookies. Twenty-five percent of cookies include names such as “UID” and “GUID”, suggesting that they are used for uniquely identifying users. Overall, we found that flash cookie usage is dropping, that HTML5 storage use is rising, and that at least one tracker is using HTML5 local storage to hold unique identifiers from third-party cookies.
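The census itself was run with Abine's crawling infrastructure, whose details are not given here. Purely as an illustration of what a shallow "homepage" crawl measures, the sketch below uses the puppeteer headless-browser library to load one homepage, collect every cookie set during the visit via the DevTools protocol, and split them into first- and third-party by domain; the UID/GUID name check mirrors the report's heuristic. The tooling is my assumption, not the study's methodology.

```typescript
// Illustrative shallow-crawl measurement (NOT the Web Privacy Census tooling):
// load a homepage headlessly, gather every cookie set during the visit, and
// classify first- vs third-party cookies plus likely unique-identifier names.
import puppeteer from "puppeteer";

async function surveyHomepage(url: string) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle2" });

  // DevTools protocol call returns cookies for all domains, not just the page's.
  const client = await page.target().createCDPSession();
  const { cookies } = (await client.send("Network.getAllCookies")) as {
    cookies: { name: string; domain: string }[];
  };

  const siteHost = new URL(url).hostname.replace(/^www\./, "");
  const firstParty = cookies.filter((c) => c.domain.replace(/^\./, "").endsWith(siteHost));
  const thirdParty = cookies.filter((c) => !c.domain.replace(/^\./, "").endsWith(siteHost));
  // Crude heuristic echoing the report: names like "UID"/"GUID" suggest unique IDs.
  const idLike = cookies.filter((c) => /\b(g?uid)\b/i.test(c.name));

  console.log(`${url}: ${cookies.length} cookies ` +
    `(${firstParty.length} first party, ${thirdParty.length} third party, ` +
    `${idLike.length} with UID/GUID-style names)`);

  await browser.close();
}

surveyHomepage("https://www.example.com").catch(console.error);
```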

For the Full Report:
http://www.law.berkeley.edu/14496.htm

2013 Honeynet Project Workshop 10-12 Feb 2013 | Dubai

The 2013 Honeynet Project Security Workshop brings together experts in the field of information security from around the world to share the latest advances and threats in information security research.

Organized by The Honeynet Project, this three-day workshop features a rare, outstanding line-up of international security professionals who will present on the latest research tools and findings in the field. This year's workshop will be held at the wonderful 5-star luxury hotel The Address Dubai Mall in Dubai, UAE, on 10-12 February 2013. The workshop includes one day of briefings and two days of hands-on tutorial trainings.
Presentation topics cover the latest honeynet/honeypot technology, Android security and social network security from The Honeynet Project and Facebook. This year, we also offer several security training courses. If you're looking to attend a high-quality and challenging workshop and to learn practical security skills, then we encourage you to take advantage of this rare opportunity.

About The Honeynet Project

The Honeynet Project is a leading international 501(c)(3) non-profit security research organization, dedicated to investigating the latest attacks and developing open source security tools to improve Internet security. With chapters around the world, our volunteers have contributed to the fight against malware (such as Conficker), discovered new attacks and created security tools used by businesses and government agencies all over the world. The organization continues to be on the cutting edge of security research by working to analyze the latest attacks and educating the public about threats to information systems across the world.
Founded in 1999, The Honeynet Project has contributed to the fight against malware and malicious hacking attacks and counts leading security professionals among its members and alumni. Our mission reads "to learn the tools, tactics and motives involved in computer and network attacks, and share the lessons learned", with three main pillars:
Research
The Honeynet Project volunteers collaborate on security research efforts covering data analysis approaches, unique security tool development and gathering data about attackers and the malicious software they use. We provide critical additional information, such as attackers' motives, how they communicate, when they attack systems and their actions after compromising a system. We provide this service through our Know Your Enemy whitepapers, The Project blog posts and our Scan of the Month challenges.
Awareness
The Honeynet Project members engage the broader security community and educate the public about threats to systems and information. We raise awareness of the threats and vulnerabilities that exist on the Internet today. We provide this information so people can better understand that they are a target, and understand the basic measures they can take to mitigate these threats as well as better handle advanced threats that slip through the defenses. This information is provided through our Know Your Enemy series of papers as well as The Honeynet Project blog and other media venues and public security workshops.
Tools
The Honeynet Project engages the broader security community via Google Summer of Code (GSoC) and other efforts to expand security tool development. For organizations interested in continuing their own research about cyber threats, we provide the tools and techniques we have developed. Recent tool examples include Cuckoo, Capture-HPC, Glastopf, HoneyC, Honeyd and Honeywall. We provide these through our Tools Site. Key tools are also described in Know Your Tools papers and on The Project blog.
Vision
Our vision for the Honeynet Project reads as follows:
The Honeynet Project is a diverse, talented, and engaged group of international computer security experts who conduct open, cross disciplinary research and development into the evolving threat landscape. It cooperates with like-minded people and organizations in that endeavor.

Short Video Explaining Honeypots (QuickTime, 45 MB)

Source:
http://www.honeynet.org/
 

Judge Scheindlin joins The Sedona Conference® eDiscovery Cooperation Training program faculty!



Judge Shira A. Scheindlin, who taught us the fundamental rules of modern e-discovery with her judgments in the Zubulake v. UBS Warburg case.




Judge Shira A. Scheindlin Joins the Faculty of
TSC's First-Ever e-Discovery Cooperation Training Program, Thursday-Friday, February 21-22, 2013
We are enormously pleased to announce that Hon. Shira A. Scheindlin will be joining the faculty of The Sedona Conference®'s first-ever e-Discovery Cooperation Training Program being held Thursday-Friday, February 21-22, 2013 in Phoenix.
Judge Scheindlin is the author of the famous Zubulake series of opinions, considered the foundation of modern e-discovery; served on the Civil Rules Advisory Committee of the Judicial Conference of the United States when the 2006 amendments to the Federal Rules of Civil Procedure were being drafted; and is co-author of the West law school casebook, Electronic Discovery and Digital Evidence, now in its second edition.



"No sophisticated lawyer can responsibly conduct a litigation in today's world without understanding that absent real cooperation among counsel, a case will not be efficiently or effectively handled," Judge Scheindlin said when accepting our invitation to join the Cooperation Training program faculty.
NOTE: This program will NOT be videotaped!
The initial publicity for this program indicated that we would depart from the usual "Sedona Rules" and videotape the proceedings for use in future judicial and continuing legal education programs. We are no longer planning to videotape this program due to early feedback. The program will be conducted consistent with "Sedona Rules," as always.
Co-chairing the program are Kenneth J. Withers, Director of Judicial Education for The Sedona Conference®, and Hon. Judge Ralph Artigliere (retired) of Florida. The faculty also includes Craig D. Ball, Craig D. Ball P.C.; William P. Butterfield, Hausfeld LLP; Hon. John M. Facciola, U.S. Magistrate Judge, Washington, DC; Prof. Steven S. Gensler, University of Oklahoma College of Law; Jennifer L. Hamilton, Deere & Company; Sherry B. Harris, Crowley Law Office; Dawson Horn III, Tyco; John H. Jessen, Jessen & Associates; Hon. Shira A. Scheindlin, U.S. District Court for the Southern District of New York, New York, NY; Jeffrey C. Sharer, Sidley Austin LLP; and Steve Susman, Susman Godfrey L.L.P.
We are pleased to invite you to apply to participate in this one-of-a-kind program.
  • Cost: $1195.00. Complete program details, faculty bios, and the Application Form are located at https://thesedonaconference.org/node/4201
  • Registration is limited to 50; apply immediately. We will be accepting applications on a rolling basis in a manner that will ensure balance among the participants.
  • You will receive a Certificate of Completion for the training program at its conclusion.
  • We will apply in advance for 11 HOURS of MCLE CREDIT, including 1 hour of ETHICS credit.
Here's how the training program works:
  • Using real world litigation hypotheticals, judges and litigators will work through stages of the pre-trial process together.
  • Scenarios and mock exercises will explore practical cooperation strategies to avoid or resolve conflicts that commonly occur at various stages of litigation: preservation, collection, production, and use of ESI; client meetings; negotiations with opposing counsel; conferences with the judge; and motion argument.
  • The faculty of veteran trial lawyers and judges will both participate in the scenarios and provide valuable critique of the registrants' participation.
Texts will include: The Sedona Conference® Cooperation Proclamation; The Sedona Conference® Cooperation Guidance for Litigators & In-House Counsel, The Case for Cooperation, The Cooperation Bullseye, and The Sedona Conference® Cooperation Proclamation: Resources for the Judiciary.
Members of The Sedona Conference® Working Group Series are eligible for a $100 discount, and full-time government employees are eligible for a $250 discount. Discounts, however, cannot be combined. If your application is accepted and you qualify for a discount, please contact our office at info@sedonaconference.org to receive a discount code to use when registering online.

For detailed info on The Sedona Conference see:
https://thesedonaconference.org/

For Ethics Cartoons see:
http://www.stus.com/stus-category.php?cat=NOW&sub=CET

Privacy considerations of online behavioural tracking

Internet users are being increasingly tracked and profiled, and their personal data are extensively used as currency in exchange for services. It is important that this new reality is better understood by all stakeholders if we are to be able to support and respect the right to privacy.
Nov 14, 2012
Download: Privacy considerations of online behavioural tracking.pdf (PDF document, 657 kB / 673,356 bytes)
English


Source:
http://www.enisa.europa.eu/activities/identity-and-trust/library/deliverables/privacy-considerations-of-online-behavioural-tracking          

Song of The Lonely Mountain




Peter Jackson's The Hobbit: An Unexpected Journey is fast approaching, and the epic adventure movie gets an appropriately epic ballad with Neil Finn's "Song of the Lonely Mountain." With angelic vocals, swooning strings and a slow, determined build, the tune captures the feel of an arduous trek across a rocky, perilous landscape. Metal crashes evoke blacksmiths hard at work forging weapons in the fire while the classical guitar floats between the chants and clattering percussion.



"'The Song of the Lonely Mountain' was developed from a dark and mysterious theme which the dwarves sing early in the movie," Finn tells Rolling Stone. He says Jackson and his team suggested he get in a "dwarven state of mind" as he shaped the song to play over the end credits. "After some days of mining underground (actually, in Peters office) I emerged with the song, then set about recording it with my sons Elroy and Liam. Dave Fridmann came in at the end with a bold mix. He seemed to respond well to my demands for 'more anvil!' Pop music needs more anvil!"




The soundtrack for The Hobbit: An Unexpected Journey will be out on December 11th. The film, part one in a trilogy, opens December 14th.


Click to listen to Neil Finn's "Song of The Lonely Mountain":

http://www1.rollingstone.com/hearitnow/player/neilfinn.html



Sources:
http://www.thehobbit.com/
http://www.thehobbitblog.com/
http://music.yahoo.com/news/neil-finn-reaches-epic-heights-song-lonely-mountain-140043435.html

For Dogs to Learn Words It Is Size That Matters


Toddlers just learning to speak associate words with shape, not size or texture. Anything shaped like a telephone, for instance, might be called “phone.”


"Dali, my 12 years old golden retriever"


But a new study suggests that dogs tend to associate words with size rather than shape.
This difference makes it “very doubtful that there is a single mammalian feature in word learning,” said Emile van der Zee, a psychologist at the University of Lincoln in England and the first author of the study, which appears in the journal PLoS One. “This study may help us understand why humans are more special when it comes to learning language.”
The researchers worked with Gable, a 5-year-old Border collie with an understanding of more than 40 words. The dog was shown a horseshoe-shaped object that the scientists called a “dax.”
After some training, the dog began to identify other objects of similar size with the same name. After taking the object home for about a month, Gable also began to associate the word with other objects of similar texture, but never objects that were simply of similar shape.
The smells of the objects were kept neutral, but the results may differ if scent is incorporated, Dr. van der Zee said.
“That would be something that we would like to do in our future research,” he said, adding that he would like to repeat the study with other mammals, including pigs and primates.
 
Source:


Daily Report: Law Enforcement vs. Cellphone Privacy

Judges and lawmakers across the country are wrangling over whether and when law enforcement authorities can peer into suspects’ cellphones and the cornucopia of evidence they provide, Somini Sengupta reports in Monday’s New York Times.
A Rhode Island judge threw out cellphone evidence that led to a man being charged with the murder of a 6-year-old boy, saying the police needed a search warrant. A court in Washington compared text messages to voice mail messages that can be overheard by anyone in a room and are therefore not protected by state privacy laws.
In Louisiana, a federal appeals court is weighing whether location records stored in smartphones deserve privacy protection, or whether they are “business records” that belong to the phone companies.
“The courts are all over the place,” said Hanni Fakhoury, a criminal lawyer with the Electronic Frontier Foundation, a San Francisco-based civil liberties group. “They can’t even agree if there’s a reasonable expectation of privacy in text messages that would trigger Fourth Amendment protection.”
The issue will attract attention on Thursday when a Senate committee considers limited changes to the Electronic Communications Privacy Act, a 1986 law that regulates how the government can monitor digital communications. Courts have used it to permit warrantless surveillance of certain kinds of cellphone data.
A proposed amendment would require the police to obtain a warrant to search e-mail, no matter how old it was, updating a provision that currently allows warrantless searches of e-mails more than 180 days old.
As technology races ahead of the law, courts and lawmakers are still trying to figure out how to think about the often intimate data that cellphones contain, said Peter P. Swire, a law professor at Ohio State University. Neither the 1986 statute nor the Constitution, he said, could have anticipated how much information cellphones may contain, including detailed records of people’s travels and diagrams of their friends.
“It didn’t take into account what the modern cellphone has — your location, the content of communications that are easily readable, including Facebook posts, chats, texts and all that stuff,” Mr. Swire said.

Source:
http://bits.blogs.nytimes.com/2012/11/26/daily-report-law-enforcement-vs-cellphone-privacy/

Justice Department Expands Hunt for Data on Cellphones

Fans of “The Wire,” the HBO series, will recall what a gold mine cellphones turned out to be for police investigating a drug ring in Baltimore. Detectives in the show used them to construct a map of who called whom at what time and how often.
Indeed, a list of incoming and outgoing calls on an individual’s cellphone can provide a robust trail of evidence.
Cellphones seem to be increasingly attractive to the Department of Justice, documents obtained by the American Civil Liberties Union show. Agencies affiliated with the department used more than 37,600 court orders in 2011 to gather cellphone data, a sharp increase from previous years. They were almost equally divided between “pen register” data, which captures outgoing phone numbers, and “trap and trace” orders, which refer to incoming phone numbers; this means one phone could have two separate orders associated with it.
The total number has roughly doubled since 2007, when cellphone communications were more limited.
By law, the data can be obtained without a search warrant establishing probable cause, though the authorities do need to tell a court that it is relevant to an investigation. To get a wiretap that allows authorities to actually listen in on the contents of a call has higher legal barriers; law enforcement officials have to convince an impartial judge of probable cause.
The lower legal threshold allows law enforcement agencies to capture crucial information, including the time and date of calls and their length, helping law enforcement officials deduce important associations among callers. Each order, the A.C.L.U. pointed out, could affect one or more individuals.
Pen register orders can also allow law enforcement to obtain data about e-mails, like the “to” and “from” fields, though not the content of those communications.
Among the total orders, the United States Marshals Service led the pack, with more than 16,000, followed by the Drug Enforcement Agency and the Federal Bureau of Investigation. The Justice Department, unlike local police, is required to report how many such orders it seeks. Still, the A.C.L.U. said it had to file a Freedom of Information Act request to obtain the latest figures.

Source:
http://bits.blogs.nytimes.com/2012/11/26/justice-department-expands-hunt-for-data-on-cellphones/