Category Archives: Technology

The agendas of security professionals in 2016

2014 was heralded as the ‘year of the data breach’ – but we’d seen nothing yet. From unprecedented data theft to crippling hacktivism attacks and highly targeted state-sponsored hacks, 2015 has been the bleakest year yet for the cyber security of businesses and organisations.

High profile breaches at Ashley Madison, TalkTalk and JD Wetherspoon have brought the protection of personal and enterprise data into the public consciousness.

In the war against cybercrime, companies are facing ever more sophisticated and crafty attacks, while the customer data they hold grows in value – and those that fail to protect it find themselves increasingly in the media and legislative spotlight, with nowhere to hide.

We asked a panel of leading industry experts to highlight the major themes for enterprise cyber security in 2016 and beyond.

The increasing sophistication of DDoS attacks

There were many cyber security stories this year that raised eyebrows, but the one that stands out for many reasons is the DDoS attack and subsequent data breach of 157,000 customer records from UK telecoms provider TalkTalk.

When TalkTalk became the victim of its third cyber attack in the space of a year, it turned out to be the biggest British cyber attack on record, costing the firm an estimated £35 million.

Dave Larson, COO of Corero Network Security, believes that DDoS being utilised as a smokescreen for more nefarious data exfiltration is emerging as a more common component of reported breach activity.

TalkTalk’s apparent lack of DDoS protection in an Internet Service Provider environment hasn’t gone unnoticed, and has raised red flags across the security community.

‘The ever-growing number of tools for automating DDoS attacks means that companies will have the threat of several different attacks happening at once, or in rapid succession,’ explains Larson. ‘If the hackers are able to automate some of their attack activities that were more labour-intensive, then companies will begin to be overwhelmed by both the speed and frequency of new attacks – unless they have the appropriate mitigation solutions in place.’

‘DDoS has evolved from your typical volumetric scale attack to smaller, short duration surgical strike attacks causing security vulnerabilities as well as availability concerns for organisations.’

Internet Service Providers (ISPs) must reconsider the legacy approach to DDoS mitigation and implement more sophisticated and granular protection at the Internet edge, says Larson.

‘Legacy scrubbing centre operations, or black holing victim IPs as a response to DDoS attacks, will become techniques of the past for dealing with this challenging cyber security issue.’
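The granular, per-source detection Larson contrasts with black-holing can be illustrated with a simple sliding-window rate tracker. This is a minimal sketch with purely illustrative thresholds, not Corero's actual mitigation logic: instead of dropping all traffic to a victim IP, it flags individual sources that exceed a packet-rate limit.

```python
from collections import defaultdict, deque
import time

# Illustrative thresholds only; real DDoS mitigation tunes these per link.
WINDOW_SECONDS = 10
MAX_PACKETS_PER_WINDOW = 1000

class RateTracker:
    """Track per-source packet rates over a sliding time window."""

    def __init__(self):
        self.packets = defaultdict(deque)  # source IP -> recent packet timestamps

    def observe(self, src_ip, now=None):
        """Record one packet; return True if this source exceeds the rate limit."""
        now = now if now is not None else time.monotonic()
        q = self.packets[src_ip]
        q.append(now)
        # Evict timestamps that have aged out of the sliding window
        while q and now - q[0] > WINDOW_SECONDS:
            q.popleft()
        return len(q) > MAX_PACKETS_PER_WINDOW
```

A flagged source can then be rate-limited or dropped individually, leaving legitimate traffic to the same destination untouched – the key difference from black-holing the victim address.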

Social malware as the new wild west

Phishing isn’t going anywhere, but we can expect an increase in sophistication: fraudulent sites offering faux customer support that results in a remote connection compromising users’ systems – or attackers even venturing onto social media outlets like Snapchat and Instagram to broaden their reach.

‘Over the last 15 years we’ve seen web attacks evolve to a highly sophisticated state, mobile applications are five years into this same cycle,’ says Ben Harknett, VP EMEA at RiskIQ. ‘But it doesn’t stop there, next to join the cyber-attack evolutionary path is social media attacks. Whilst still in the early stages, attacks like this have huge potential in online fraud as demonstrated by the Metro Bank Twitter incident earlier this year.’

‘With a predictable pattern for how attacks evolve, we fully expect 2016 will see rapidly increasing maturity in attack vectors involving social media. Brace yourselves, the impact of viral, social media delivered, malware will be huge.’

The threat still lurking within organisations’ walls

Many businesses are getting serious about cyber security and adopting fundamentally new approaches to deal with today’s complex threat landscape. As Dave Palmer, director of technology at Darktrace explains, at the core of this shift is an acknowledgement that threat is, by default, inside our organisations, and must be kept in check by continual monitoring and advanced detection.

‘Companies need to accept that new reality and surmount the old idea that all threats can be blocked at the perimeter,’ says Palmer. ‘Insider threat will loom large over organisations in 2016.

‘However strong your perimeter is, or however well trained your staff are, insiders will always pose a significant risk. They are potentially the most damaging type of threat, as they use legitimate credentials and exploit their access in ways that are difficult to predict.’

And as networks grow and more devices become interconnected, the pool of potential insiders is getting larger too, spanning from your own employees, through to customers, suppliers or contractors who have access to parts of the corporate network.

Social engineering is one of the main mechanisms used to gain unauthorised access to systems. As Brian Chappell, director of technical services EMEAI and APAC at identity management firm BeyondTrust, explains, people remain a weak link – but they could also be your strongest resource.

‘Most organisations rely wholly on technological solutions to secure their networks, giving their employees only cursory consideration, with occasional communication around the risks their actions carry,’ says Chappell. ‘We don’t need to scare our staff, but rather look to educate them, frequently, on what they can do to help mitigate those risks. Your employees could be the biggest resource you have to protect your systems.’

Behavioural analytics and machine learning coming to the fore

Darktrace’s Palmer argues that the single most important innovation in the cyber defence sector has been the new capability of machines to automatically learn what is normal and abnormal within a network, and to identify in-progress cyber-attacks they have never seen before.

‘The most successful companies in proactive security have embraced this model of ‘immune system’ technology that continually looks out for abnormalities, and alerts the security team in real time, before the damage has been done,’ he says.

‘Cyber security will move further towards the boardroom as a corporate issue, rather than a problem for the IT department, and become a continual process of risk mitigation.’

Automation will be critical to this process, and machine learning will become a de facto technological approach to draw insights from large sets of noisy data, quickly, predicts Palmer.
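The ‘learn what is normal, flag what is not’ idea behind this immune-system model can be reduced to a toy example. The sketch below is illustrative only, not Darktrace's method: it establishes a baseline from historical measurements – say, login counts or traffic volume per hour – and flags any new observation that deviates by more than a few standard deviations.

```python
import statistics

def is_anomalous(history, value, threshold=3.0):
    """Flag an observation more than `threshold` standard deviations
    from the baseline learned from `history`.

    A toy stand-in for behavioural anomaly detection: no signatures,
    no prior knowledge of the attack, just a model of 'normal'.
    """
    if len(history) < 2:
        return False  # too little data to establish a baseline
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return value != mean
    return abs(value - mean) / stdev > threshold
```

Production systems model many correlated signals per device and user rather than one scalar, but the principle is the same: deviation from a learned baseline, not a match against a known threat, triggers the alert.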

Jonathan Sander, VP of product strategy at Lieberman Software, thinks this year will see a major revolution in how firewalls are installed, configured and run, delivering more value with less human tuning.

‘Part of what takes humans out of the firewall tuning business will be collective experience being put in the box,’ says Sander. ‘Some of it will be the application of machine learning technologies and other nascent AI peeking out of the lab and impacting security operations. We know that the bad guys are using automated techniques to attack, and it’s only by putting our machines to fight theirs on the front line that we can hope to keep up.’

The CSO role getting a rethink

We’re seeing a growing number of CSO (Chief Security Officer) and CISO (Chief Information Security Officer) roles as a reaction to the current cyber security environment.

‘Every large company thinks they need to hire one, but often neither the business nor the CSO understands what the role should be, to whom it should report, or even what performance or success should look like,’ says one such CSO, Dave Baker of identity and device management firm Okta. ‘I have seen many cases where the CSO was recruited from the compliance world with little understanding of actual attack security. This seriously diminishes their effectiveness.’

As the white hat hacker community grows and enters the enterprise, Baker predicts the CSO role will see a much-needed evolution.

‘Hackers who understand technical details of attack security, but also have the business acumen to communicate with CEOs and convince CIOs of their importance will make a significant impact on the role of security professionals,’ says Baker.

‘Chris Wysopal, CTO of Veracode, as well as Alex Stamos, CSO of Facebook, are popular examples of a hacker-turned-executive – we’ll see more like them in coming years.’

Ransomware turns on the enterprise

As Michael Sutton, CISO at security-as-a-service platform Zscaler, explains, ransomware has hit a sweet spot.

‘Users are all too willing to begrudgingly pay an expensive but not excessive ransom in exchange for the return of their precious data,’ Sutton says. ‘Even the FBI are recommending that it’s easier to pay than fight. The wildly profitable CryptoLocker has attracted many clones since it was largely knocked offline following Operation Tovar.’

Many of these clones, including popular variants such as CryptoWall and TorrentLocker, have largely followed the proven formula, but we’re starting to see variations such as mobile- and Linux-focused ransomware.

‘The latter is especially important as it’s more likely to impact the websites and code repositories of enterprises, who in our experience are also very willing to pay up rather than risk losing critical intellectual property,’ says Sutton.

‘Expect ransomware to become increasingly corporate focused in 2016 and as it does, enterprises won’t get away with paying consumer rates. The criminals behind the ransomware campaigns are savvy and once they realise that they’ve locked up source code and financial documents that haven’t been properly backed up, you can expect prices to skyrocket … and be paid.’

 

Source: http://www.information-age.com

Software robotics enters the banking system

A whole new range of companies, such as Google and Apple, is starting to provide financial services, focused on the most profitable products and services currently offered by banks. Banks are experiencing new and disruptive challenges to their traditional business models and product offerings.

Banks are responding by offering similar services in a fashion now being expected by consumers, but they are facing huge challenges in adapting their existing technologies, legacy systems and business architectures. This is where Banking Robotic Software is making rapid change possible.

The traditional response to these new competitors would be to replace legacy IT systems with state-of-the-art technologies. Unfortunately, replacing these core systems would take years, enormous amounts of capital, and considerable operational risk. An option banks are increasingly taking is to augment existing systems with Banking Robotic Software. This approach allows new products and services to be offered through new customer interfaces, with Robots processing the various transactions through the user interfaces of existing legacy applications.

This approach would be less feasible if humans were required, but it becomes practical because Robots work at twice the speed of humans, 24 hours a day, 7 days a week. Since these Robots can work on virtual computers, they require none of the infrastructure that humans require, and are not susceptible to the attendance, coaching and other management overheads that humans entail.

Banking Robotic Software can also be deployed much faster than large IT projects. Robots can be configured and trained on timelines similar to training a new member of staff. Better still, once one Robot has been trained, all other Robots immediately receive the same training.
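The ‘train once, deploy everywhere’ property described above can be sketched in a few lines. This is an illustrative toy, not any real RPA product's API: a Robot's training is simply a recorded sequence of steps, and because every Robot instance shares the same task definition, a step recorded once is immediately available to all of them.

```python
class TaskDefinition:
    """A recorded sequence of UI actions - the Robot's 'training'."""

    def __init__(self, name):
        self.name = name
        self.steps = []  # ordered actions the robot will replay

    def record_step(self, action):
        self.steps.append(action)

class Robot:
    """A worker that replays a shared task definition against a UI session."""

    def __init__(self, task):
        # Shared by reference: retraining the task reaches every Robot at once
        self.task = task

    def run(self, screen):
        # Replay each recorded action against a (mock) legacy UI session
        return [action(screen) for action in self.task.steps]
```

Because the definition is shared rather than copied, fleets of Robots can be scaled up or retrained without per-worker effort – the property that makes the approach cheaper than retraining human staff one by one.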

Banks are also applying Banking Robotic Software to their other business lines as a means of reducing cost and improving customer service. At a cost of less than $7.5K per year, Robots are also enabling companies to repatriate work from business process outsourcers, resulting in further cost savings and improved customer service.

Insurance Robotic Software

Insurance Robotic Software is helping reduce cost, mitigate risk and improve customer service. Insurance policies are evolving rapidly to accommodate new situations and technologies, while the need remains to support legacy policies. In the past, companies relied on humans to retain the knowledge and experience needed to support these legacy policies. Today’s employment market, however, sees a much higher turnover of staff, with the attendant risk of losing the staff experienced in administering legacy policies.

Robots can be trained in very complex business logic and are therefore able to process claims across a full range of policies, quickly and efficiently, with no human errors. They also eliminate the risks of staff turnover.

The history of computer technology, from its earliest beginnings to 2015

Computer technology has developed rapidly, and the computer you know today is a remarkably capable machine, able to do almost anything and to make people's work easier. But do you know the history of the computer, from the first machines to the present? This article traces computer technology's long journey, from its earliest beginnings to 2015.

The computer was born not for entertainment or email but out of a need to solve a serious number-crunching crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card based computers that took up entire rooms.

Today, we carry more computing power on our smartphones than was available in these early models. The following brief history of computing is a timeline of how computers evolved from their humble beginnings to the machines of today that surf the Internet, play games and stream multimedia in addition to crunching numbers.

1801: In France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1822: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. The project, funded by the English government, is a failure. More than a century later, however, the world’s first computer was actually built.

1890: Herman Hollerith designs a punch card system to tabulate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.

1936: Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.

1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts or shafts.

1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information on its main memory.

1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discovered how to make an electric switch with solid materials and no need for a vacuum.

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: The FORTRAN programming language is born.

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.

1964: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public.

1969: A group of developers at Bell Labs produce UNIX, an operating system that addressed compatibility issues. Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities. Due to the slow nature of the system, it never quite gained traction among home PC users.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.

1971: Alan Shugart leads a team of IBM engineers who invent the “floppy disk,” allowing data to be shared among computers.

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1974-1977: A number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack’s TRS-80 — affectionately known as the “Trash 80” — and the Commodore PET.

1975: The January issue of Popular Electronics magazine features the Altair 8080, described as the “world’s first minicomputer kit to rival commercial models.” Two “computer geeks,” Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak start Apple Computers on April Fool’s Day and roll out the Apple I, the first computer with a single-circuit board.
[Image: The TRS-80, introduced in 1977, was one of the first machines whose documentation was intended for non-geeks. Credit: Radio Shack]

1977: Radio Shack’s initial production run of the TRS-80 was just 3,000. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wished.

1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.

1978: Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program.

1979: Word processing becomes a reality as MicroPro International releases WordStar.
[Image: The first IBM personal computer, introduced on Aug. 12, 1981, used the MS-DOS operating system. Credit: IBM]

1981: The first IBM personal computer, code-named “Acorn,” is introduced. It uses Microsoft’s MS-DOS operating system. It has an Intel chip, two floppy disks and an optional color monitor. Sears & Roebuck and Computerland sell the machines, marking the first time a computer is available through outside distributors. It also popularizes the term PC.

1983: Apple’s Lisa is the first personal computer with a GUI. It also features a drop-down menu and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor and the first to be marketed as a “laptop.”

1985: Microsoft announces Windows, its response to Apple’s GUI. Commodore unveils the Amiga 1000, which features advanced audio and video capabilities.

1985: The first dot-com domain name is registered on March 15, years before the World Wide Web would mark the formal beginning of Internet history. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered.

1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides speed comparable to mainframes.

1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web.

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1994: PCs become gaming machines as “Command & Conquer,” “Alone in the Dark 2,” “Theme Park,” “Magic Carpet,” “Descent” and “Little Big Adventure” are among the games to hit the market.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple’s court case against Microsoft in which it alleged that Microsoft copied the “look and feel” of its operating system.

1999: The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires.

2001: Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned GUI.

2003: The first 64-bit processor, AMD’s Athlon 64, becomes available to the consumer market.

2004: Mozilla’s Firefox 1.0 challenges Microsoft’s Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches.

2005: YouTube, a video sharing service, is founded. Google acquires Android, a Linux-based mobile phone operating system.

2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo’s Wii game console hits the market.

2007: The iPhone brings many computer functions to the smartphone.

2009: Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features.

2010: Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant tablet computer segment.

2011: Google releases the Chromebook, a laptop that runs the Google Chrome OS.

2012: Facebook gains 1 billion users on October 4.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

 

Source: http://www.livescience.com

The impact of technology such as Twitter on small businesses

Much has been made of how big companies like Dell, Starbucks and Comcast use Twitter to promote their products and answer customers’ questions. But today, small businesses outnumber the big ones on the free microblogging service, and in many ways, Twitter is an even more useful tool for them.

For many mom-and-pop shops with no ad budget, Twitter has become their sole means of marketing. It is far easier to set up and update a Twitter account than to maintain a Web page. And because small-business owners tend to work at the cash register, not in a cubicle in the marketing department, Twitter’s intimacy suits them well.

“We think of these social media tools as being in the realm of the sophisticated, multiplatform marketers like Coca-Cola and McDonald’s, but a lot of these supersmall businesses are gravitating toward them because they are accessible, free and very simple,” said Greg Sterling, an analyst who studies the Internet’s influence on shopping and local businesses.

Small businesses typically get more than half of their customers through word of mouth, he said, and Twitter is the digital manifestation of that. Twitter users broadcast messages of up to 140 characters in length, and the culture of the service encourages people to spread news to friends in their own network.

Umi, a sushi restaurant in San Francisco, sometimes gets five new customers a night who learned about it on Twitter, said Shamus Booth, a co-owner.

He twitters about the fresh fish of the night — “The O-Toro (bluefin tuna belly) tonight is some of the most rich and buttery tuna I’ve had,” he recently wrote — and offers free seaweed salads to people who mention Twitter.
[Image: Curtis Kimball, owner of a crème brûlée cart in San Francisco, uses Twitter to drive his customers to his changing location. Credit: Peter DaSilva for The New York Times]

Twitter is not just for businesses that want to lure customers with mouth-watering descriptions of food. For Cynthia Sutton-Stolle, the co-owner of Silver Barn Antiques in tiny Columbus, Tex., Twitter has been a way to find both suppliers and customers nationwide.

Since she joined Twitter in February, she has connected with people making lamps and candles that she subsequently ordered for her shop and has sold a few thousand dollars of merchandise to people outside Columbus, including to a woman in New Jersey shopping for graduation gifts.

“We don’t even have our Web site done, and we weren’t even trying to start an e-commerce business,” Ms. Sutton-Stolle said. “Twitter has been a real valuable tool because it’s made us national instead of a little-bitty store in a little-bitty town.”

Scott Seaman of Blowing Rock, N.C., also uses Twitter to expand his customer base beyond his town of about 1,500 residents. Mr. Seaman is a partner at Christopher’s Wine and Cheese shop and owns a bed and breakfast in town. He sets up searches on TweetDeck, a Web application that helps people manage their Twitter messages, to start conversations with people talking about his town or the mountain nearby. One person he met on Twitter booked a room at his inn, and a woman in Dallas ordered sake from his shop.

The extra traffic has come despite his rarely pitching his own businesses on Twitter. “To me, that’s a turn-off,” he said. Instead of marketing to customers, small-business owners should use the same persona they have offline, he advised. “Be the small shopkeeper down the street that everyone knows by name.”

Chris Mann, the owner of Woodhouse Day Spa in Cincinnati, twitters about discounts for massages and manicures every Tuesday. Twitter beats e-mail promotions because he can send tweets from his phone in a meeting and “every single business sends out an e-mail,” he said.

Even if a shop’s customers are not on Twitter, the service can be useful for entrepreneurs, said Becky McCray, who runs a liquor store and cattle ranch in Oklahoma and publishes a blog called Small Biz Survival.

In towns like hers, with only 5,000 people, small-business owners can feel isolated, she said. But on Twitter, she has learned business tax tips from an accountant, marketing tips from a consultant in Tennessee and start-up tips from the founder of several tech companies.

Anamitra Banerji, who manages commercial products at Twitter, said that when he joined the company from Yahoo in March, “I thought this was a place where large businesses were. What I’m finding more and more, to my surprise every single day, is business of all kinds.”

Twitter, which does not yet make money, is now concentrating on teaching businesses how they can join and use it, Mr. Banerji said, and the company plans to publish case studies. He is also developing products that Twitter can sell to businesses of all sizes this year, including features to verify businesses’ accounts and analyze traffic to their Twitter profiles.

According to Mr. Banerji, small-business owners like Twitter because they can talk directly to customers in a way that they were able to do only in person before. “We’re finding the emotional distance between businesses and their customers is shortening quite a bit,” he said.

 

Source: http://www.nytimes.com

Solar power becoming more feasible for businesses and households with large-scale power requirements

As per a recent report by Ernst & Young (E&Y), the price of Solar Cells technology is falling so fast that by 2013 solar panels would cost just half of what they cost in 2009. The average one-time cost of installing solar PV panels was $2 per unit of generation capacity in 2009, and in 2011 it came down to $1.50. According to industry experts, this rate of decline is set to continue, and prices would reach $1 in 2013.
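The figures quoted above ($2 per unit in 2009, $1.50 in 2011, a projected $1 in 2013) imply a compound annual rate of decline, which is easy to verify:

```python
def annual_decline(start_price, end_price, years):
    """Compound annual rate of price decline between two observations."""
    return 1 - (end_price / start_price) ** (1 / years)

# From the E&Y figures cited above
rate_2009_2011 = annual_decline(2.00, 1.50, 2)  # roughly 13% per year
rate_2011_2013 = annual_decline(1.50, 1.00, 2)  # roughly 18% per year
```

So the projected 2011-2013 decline is actually steeper than the observed 2009-2011 one, consistent with the claim that the rate of decline would continue or accelerate.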

Feasibility

Currently, solar PV technology is economically feasible for homeowners and small businesses due to government aid offered in the form of feed-in tariffs. However, the latest analysis suggests that rising fossil fuel prices and declining Solar Cells prices will together make solar installations cost-effective without government aid, even at larger scale.

With this perspective, it has become apparent that government support for different types of solar power systems over these years, and even in the near future, makes proper economic sense. On similar lines, another report showed that the month-to-month spot price of silicon fell by 28%. This is important because silicon is the basic material used in making the majority of Solar Cells.

The Government Perspective

While private and independent bodies have reported this drastic future drop in the price of solar power technology, government advisory bodies have something else to say. According to the Committee on Climate Change (CCC) in the United Kingdom, solar power remains too expensive to be considered a near-term option. However, according to E&Y, the government perspective fails to account for the wider economic benefits associated with Solar Cells.

Comparison

Dollar price per watt of peak power capacity is taken as the standard for comparing the cost of power generation from different energy sources. The comparison between solar power and other energy sources can be made on the basis of the following factors:

  • Upfront expenses
  • Fuel prices
  • Discount rates
  • Maintenance cost
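A levelized cost of energy combines exactly the four factors listed above: discounted lifetime costs divided by discounted lifetime output. The sketch below uses the standard textbook formula with purely illustrative numbers, not figures from the E&Y report; note how solar's zero fuel cost offsets its high upfront expense.

```python
def levelized_cost(upfront, annual_fuel, annual_maintenance, discount_rate,
                   annual_output_kwh, lifetime_years):
    """Cost per kWh: discounted lifetime costs / discounted lifetime output."""
    total_cost = float(upfront)  # upfront expense is paid at year zero
    total_output = 0.0
    for year in range(1, lifetime_years + 1):
        discount = (1 + discount_rate) ** year
        total_cost += (annual_fuel + annual_maintenance) / discount
        total_output += annual_output_kwh / discount
    return total_cost / total_output

# Illustrative solar installation: high upfront cost, zero fuel cost
solar = levelized_cost(upfront=10000, annual_fuel=0, annual_maintenance=100,
                       discount_rate=0.05, annual_output_kwh=5000,
                       lifetime_years=25)
```

As the upfront cost per watt falls (the trend reported above), this per-kWh figure falls with it, which is what eventually brings large-scale solar on par with retail grid prices.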

According to the report, regular short-term support from the government would bring the levelized cost of large-scale solar energy production on par with the retail price of energy between 2016 and 2019. This is a clear suggestion that within a decade, businesses with huge power demands would be able to install unsubsidized Solar Cells systems rather than buy power from the grid.