Category Archives: Computer

Accounting Software To Manage Your Business Financial Tasks

Adding accounting software can help business owners deal with the complexity of financial tasks. Intuit QuickBooks, as your accounting software, gives you the information you need to keep your business on track. Choosing the right accounting software is not an easy task, as the options are plentiful. When looking for software worth considering, the essential factors to keep in mind are its features, its price, whether both of those fit your business, and the quality of its customer care. Getting your business covered with reliable accounting software is exactly what you need, and there are many reasons to consider it.

Having a small business doesn't mean you won't face hefty financial tasks. Accounting software like QuickBooks can give you full control over basic financial functions such as tracking business expenses, invoicing, inventory, and more. Tracking expenses and categorizing each of them can be a hassle, but not when you choose the right accounting software. The system can also handle inventory and analyze profitability seamlessly. Do you have trouble managing cash flow?

You can minimize the hassle of getting immediate information about cash flow, such as your customers' payment due dates, your own bills, and more. To build a successful business you have to convince your customers, not only with the quality of your product or service but also with the way you present yourself. A professional presentation matters, even for things like sales estimates, invoices, and sales receipts. Create your invoices online and let both you and your clients experience the benefits: you get paid faster, and your customers enjoy a smoother payment process.

QuickBooks also makes it possible to get immediate reports on your business, such as its balance sheet, its profit and loss, and much more. Say goodbye to stressful situations when it comes to tax. The software can reduce your accountant's workload by providing easy access to your financial records and other related information. On top of that, knowing the estimated tax you will owe and the applicable tax rates helps you manage your taxes. And beyond its features, you are also covered by QuickBooks phone support.

No matter how good the system is, the chance of an error or a malfunction is never zero. If something goes wrong with your system, the QuickBooks support team will quickly fix the root of the problem. If your Enterprise version is performing poorly, contact QuickBooks Enterprise support directly so your issue can be resolved immediately. QuickBooks is an ideal accounting package for keeping your business's financial tasks on track. Should you consider it? If you are still unsure, the free trial version of this accounting software can help you decide.

Self-driving car technology still needs development

This accident, the first caused by a self-driving car after nearly 2.4 million kilometres of autonomous driving, suggests that Google’s vehicles still need more common sense. A human driver would probably have just driven over the sandbags.

Researchers at the University of La Laguna in the Canary Islands are building a system which might have helped Google’s Lexus see the sandbags for the non-threat they were. They use Microsoft Kinect cameras, originally developed for the Xbox One gaming console, to improve self-driving cars’ obstacle avoidance at close range.

Self-driving cars typically use a combination of sensors to detect and avoid obstacles. Radar and laser-based lidar systems are generally used for objects at long range, while ultrasonic detectors and stereo cameras sense cars and pedestrians closer in.

Obstacles close to the ground like ramps, kerbs and sandbags are difficult to make out, says Javier Hernandez-Aceituno, lead author of the study: “Laser-based sensors are not suitable for this task because they detect ramps as obstacles. Ultrasonic sensors are also unsuitable due to their low precision.”

Hernandez-Aceituno decided to try using a Kinect, a depth-sensing camera that uses an infrared laser to capture an instantaneous 3D map of objects up to about 4 metres away.

Last year, he and his colleagues installed a Kinect on an experimental golf cart called Verdino. A low-speed self-driving vehicle, Verdino is also equipped with laser rangefinders and stereo cameras from a PlayStation 4.

They set the Verdino loose on an outdoor course with ramps, kerbs and stairs, using one obstacle detection program to process data from all the sensors.

Misjudged ramp
The laser rangefinder ignored the lowest steps and incorrectly decided that the ramp was too steep to navigate. The camera gave inaccurate results for very near and far obstacles, and suffered from false detections.

The Kinect, however, produced more accurate results and fewer false positives than the stereo camera. Its biggest problem was spurious obstacles created by reflections of sunlight, although Hernandez-Aceituno was able to filter these out.
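To sketch what this kind of close-range processing can look like, the toy example below (in Python, using NumPy) estimates each depth pixel's height above a flat ground plane to flag low obstacles within the Kinect's roughly four-metre range, and suppresses one-frame artefacts such as sunlight glints by requiring detections to persist across frames. The camera geometry, field of view and thresholds here are assumptions for illustration, not the Verdino team's actual pipeline.

import numpy as np

def detect_low_obstacles(depth_m, camera_height=0.6, max_range=4.0,
                         min_obstacle_height=0.05):
    """Toy sketch: flag depth pixels that rise above a flat ground plane.
    depth_m is a 2D array of depth readings in metres (e.g. from a Kinect).
    Assumes the camera looks straight ahead from camera_height metres up."""
    rows, _ = depth_m.shape
    # Vertical angle of each pixel row (assumed ~43 degree vertical field of view).
    v_angles = np.linspace(np.radians(21.5), np.radians(-21.5), rows)[:, None]
    in_range = (depth_m > 0) & (depth_m <= max_range)
    # Height of each observed point relative to the assumed ground plane.
    point_height = camera_height + depth_m * np.sin(v_angles)
    return in_range & (point_height > min_obstacle_height)

class PersistenceFilter:
    """Suppress one-frame artefacts (e.g. sunlight glints) by requiring an
    obstacle cell to be seen in several consecutive frames before reporting it."""
    def __init__(self, shape, frames_required=3):
        self.count = np.zeros(shape, dtype=int)
        self.frames_required = frames_required

    def update(self, obstacle_mask):
        self.count = np.where(obstacle_mask, self.count + 1, 0)
        return self.count >= self.frames_required

In a real vehicle the ground is neither flat nor level with the camera, so a fitted ground-plane model would replace the fixed-height assumption used here.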

“The Kinect sensor vastly outperforms stereo vision at accurately detecting obstacles on the road,” says Hernandez-Aceituno, “[and] allows an autonomous vehicle to navigate safely in areas where laser rangefinders cannot detect obstacles.”

History of Computer-Aided Manufacturing (CAM)

Computer-Aided Manufacturing (CAM) is a software automation process that converts a product drawing or model directly into the coded instructions that let a machine manufacture the product. It is used in machines such as lathes and milling machines, and it allows computer-generated work instructions to be communicated directly to the manufacturing equipment.
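To make the idea concrete, here is a minimal, hypothetical sketch (in Python) of that conversion step: it traces a 2D outline and emits simplified G-code-style toolpath commands of the kind a CNC mill consumes. It is an illustration only; real CAM systems also account for tool geometry, feeds and speeds, material removal and collision avoidance.

def outline_to_gcode(points, safe_z=5.0, cut_z=-1.0, feed=300):
    """Toy illustration of the CAM step: turn a 2D outline (the 'drawing')
    into G-code-style toolpath commands a CNC mill could follow."""
    lines = ["G21 ; units in millimetres", "G90 ; absolute coordinates"]
    x0, y0 = points[0]
    lines.append(f"G0 Z{safe_z:.3f}")            # lift the tool clear of the work
    lines.append(f"G0 X{x0:.3f} Y{y0:.3f}")      # rapid move to the start point
    lines.append(f"G1 Z{cut_z:.3f} F{feed}")     # plunge to cutting depth
    for x, y in points[1:] + [points[0]]:        # trace the outline and close it
        lines.append(f"G1 X{x:.3f} Y{y:.3f} F{feed}")
    lines.append(f"G0 Z{safe_z:.3f}")            # retract
    return "\n".join(lines)

# Example: toolpath for a 20 mm square
print(outline_to_gcode([(0, 0), (20, 0), (20, 20), (0, 20)]))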

CAM grew out of numerically controlled (NC) machines in the early 1950s. These machines were directed by a set of coded instructions on punched paper tape. The proposal to develop the first numerically controlled machine was commissioned to the Massachusetts Institute of Technology (MIT) by the US Air Force in 1949, and the proposed machine was demonstrated in 1952.

The motivation for developing these numerically controlled machines was the expense of machining complex curved geometries in two or three dimensions by manual means. The machines were also designed with easier programming and easy storage of programs in mind: a program can be changed easily, manual errors are avoided, numerically controlled machines are safer to operate, and complex geometry becomes affordable.

John T. Parsons proposed the concept of numerical control in 1948. In 1950, the MIT Servomechanisms Laboratory took on the Numerical Control (NC) milling project. The remaining parts of the program were released in late 1952, along with the first successful demonstration. After 1955, major companies in the industry developed their own machine designs.

IBM's automatic tool changer in 1955, G&L's first production skin-miller in 1957, and the machining center developed by K&T are all considered major developments that promoted the technology. CAD drafting and sculptured surfaces were developed in 1965, and around 7,700 NC machines were installed in the same year.

The concept of the computer numerical control (CNC) machine was proposed in 1967. CAD/CAM machines saw their major development around 1972, 3D CAD/CAM systems were introduced in 1976, and expert CAD/CAM systems were developed in 1989. These developments made manufacturing objects with CAM systems easier and more efficient.

The agendas of security professionals in 2016

2014 was heralded as the ‘year of the data breach’ – but we’d seen nothing yet. From unprecedented data theft to crippling hacktivism attacks and highly targeted state-sponsored hacks, 2015 has been the bleakest year yet for the cyber security of businesses and organisations.

High profile breaches at Ashley Madison, TalkTalk and JD Wetherspoon have brought the protection of personal and enterprise data into the public consciousness.

In the war against cybercrime, companies are facing off against ever more sophisticated and crafty approaches, while the customer data they hold grows in value, and those that fail to protect it find themselves increasingly in the media and legislative spotlight with nowhere to hide.

We asked a panel of leading industry experts to highlight the major themes for enterprise cyber security in 2016 and beyond.

The increasing sophistication of DDoS attacks

There were many cyber security stories this year that raised eyebrows, but the one that stands out for many reasons is the DDoS attack and subsequent data breach of 157,000 customer records from UK telecoms provider TalkTalk.

When TalkTalk became the victim of its third cyber attack in the space of a year, it turned out to be the biggest British cyber attack on record, costing the firm an estimated £35 million.

Dave Larson, COO of Corero Network Security, believes that DDoS being utilised as a smokescreen for more nefarious data exfiltration is emerging as a more common component of reported breach activity.

TalkTalk’s apparent lack of DDoS protection in an Internet Service Provider environment hasn’t gone unnoticed, and has raised red flags across the security community.

‘The ever-growing number of tools for automating DDoS attacks means that companies will have the threat of several different attacks happening at once, or in rapid succession,’ explains Larson. ‘If the hackers are able to automate some of their attack activities that were more labour-intensive, then companies will begin to be overwhelmed by both the speed and frequency of new attacks – unless they have the appropriate mitigation solutions in place.’

‘DDoS has evolved from your typical volumetric scale attack to smaller, short duration surgical strike attacks causing security vulnerabilities as well as availability concerns for organisations.’

Internet Service Providers (ISPs) must reconsider the legacy approach to DDoS mitigation and implement more sophisticated and granular protection at the Internet edge, says Larson.

‘Legacy scrubbing centre operations, or black-holing victim IPs as a response to DDoS attacks, will become techniques of the past for dealing with this challenging cybersecurity issue.’
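As a rough illustration of what more granular protection can mean in practice (throttling individual abusive sources rather than black-holing the victim's entire address), here is a minimal sliding-window rate-limiting sketch in Python. The threshold, window size and address are arbitrary assumptions, and production DDoS mitigation runs in dedicated network equipment at line rate, not in application code like this.

import time
from collections import defaultdict, deque

class SlidingWindowLimiter:
    """Flag source IPs whose request rate exceeds a threshold within a time
    window -- a toy, in-process illustration of per-source filtering."""
    def __init__(self, max_requests=100, window_seconds=10):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.history = defaultdict(deque)    # source IP -> recent request times

    def allow(self, source_ip, now=None):
        now = time.time() if now is None else now
        q = self.history[source_ip]
        while q and now - q[0] > self.window_seconds:
            q.popleft()                      # discard requests outside the window
        q.append(now)
        return len(q) <= self.max_requests   # False => throttle or drop

limiter = SlidingWindowLimiter(max_requests=100, window_seconds=10)
if not limiter.allow("203.0.113.7"):         # documentation-range example address
    pass  # e.g. drop the packet or push a block rule to the network edge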

Social malware as the new wild west

Phishing isn’t going anywhere, but we can expect an increase in sophistication: fraudulent sites offering faux-customer support resulting in a remote connection compromising users’ systems – or attackers even venturing to social media outlets like SnapChat and Instagram to broaden their reach.

‘Over the last 15 years we’ve seen web attacks evolve to a highly sophisticated state, mobile applications are five years into this same cycle,’ says Ben Harknett, VP EMEA at RiskIQ. ‘But it doesn’t stop there, next to join the cyber-attack evolutionary path is social media attacks. Whilst still in the early stages, attacks like this have huge potential in online fraud as demonstrated by the Metro Bank Twitter incident earlier this year.’

‘With a predictable pattern for how attacks evolve, we fully expect 2016 will see rapidly increasing maturity in attack vectors involving social media. Brace yourselves, the impact of viral, social media delivered, malware will be huge.’

The threat still lurking within organisations’ walls

Many businesses are getting serious about cyber security and adopting fundamentally new approaches to deal with today’s complex threat landscape. As Dave Palmer, director of technology at Darktrace explains, at the core of this shift is an acknowledgement that threat is, by default, inside our organisations, and must be kept in check by continual monitoring and advanced detection.

‘Companies need to accept that new reality and surmount the old idea that all threats can be blocked at the perimeter,’ says Palmer. ‘Insider threat will loom large over organisations in 2016.

However strong your perimeter is, or how well your staff are trained, insiders will always pose a significant risk. They are potentially the most damaging types of threats, as they use legitimate credentials and exploit their access in ways that are difficult to predict.’

And as networks grow and more devices become interconnected, the pool of potential insiders is getting larger too, spanning from your own employees, through to customers, suppliers or contractors who have access to parts of the corporate network.

Social engineering is one of the main mechanisms used to gain unauthorised access to systems. As Brian Chappell, director of technical services EMEAI and APAC at identity management firm BeyondTrust, explains, people remain a weak link, but they could also be your strongest resource.

‘Most organisations rely wholly on technological solutions to secure their networks giving their employees cursory consideration with occasional communication around the risks that their actions have,’ says Chappell. ‘We don’t need to scare our staff but rather look to educate, frequently, what they can do to help mitigate those risks. Your employees could be the biggest resource you have to protect your systems.’

Behavioural analytics and machine learning coming to the fore

Darktrace’s Palmer argues that the single most important innovation in the cyber defence sector has been the new capability of machines to automatically learn what is normal and abnormal within a network, and to identify in-progress cyber-attacks without having seen them before.

‘The most successful companies in proactive security have embraced this model of ‘immune system’ technology that continually looks out for abnormalities, and alerts the security team in real time, before the damage has been done,’ he says.

‘Cyber security will move further towards the boardroom as a corporate issue, rather than a problem for the IT department, and become a continual process of risk mitigation.’

Automation will be critical to this process, and machine learning will become a de facto technological approach to draw insights from large sets of noisy data, quickly, predicts Palmer.
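As an illustration of the 'learn what normal looks like' approach Palmer describes, the sketch below trains a generic unsupervised anomaly detector (scikit-learn's IsolationForest) on hypothetical per-connection features and flags outliers. The feature names and numbers are invented for the example; this is not a description of any vendor's technology.

import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical per-connection features: bytes sent, bytes received,
# duration in seconds, distinct destination ports contacted in the last hour.
rng = np.random.default_rng(0)
normal_traffic = rng.normal(loc=[5e4, 2e5, 30, 3],
                            scale=[2e4, 8e4, 15, 2],
                            size=(5000, 4))

# Learn what "normal" looks like from historical traffic.
detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(normal_traffic)

# Score new connections; -1 marks an outlier worth alerting on.
new_connections = np.array([
    [6e4, 1.8e5, 25, 2],     # looks like ordinary traffic
    [9e6, 1e3, 2, 250],      # huge upload to many ports: possible exfiltration
])
print(detector.predict(new_connections))   # typically prints [ 1 -1 ]

The same pattern, fitting on history and scoring new events in real time, is what lets such systems raise an alert on an attack they have never seen before.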

Jonathan Sander, VP of product strategy at Lieberman Software, thinks this year we will see a major revolution in how firewalls are installed, configured and run, delivering more value with less human tuning.

‘Part of what takes humans out of the firewall tuning business will be collective experience being put in the box,’ says Sander. ‘Some of it will be the application of machine learning technologies and other nascent AI peeking out of the lab and impacting security operations. We know that the bad guys are using automated techniques to attack, and it’s only by putting our machines to fight theirs on the front line that we can hope to keep up.’

The CSO role getting a rethink

We’re seeing a growing number of CSO (Chief Security Officer) and CISO (Chief Information Security Officer) roles as a reaction to the current cyber security environment.

‘Every large company thinks they need to hire one, but often neither the business nor the CSO understands what their role should be, to whom they should report, or even what their performance or success should look like,’ says one CSO, Dave Baker of identity and device management firm Okta. ‘I have seen many cases where the CSO was recruited from the compliance world with little understanding of actual attack security. This seriously diminishes their effectiveness.’

As the white hat hacker community grows and enters the enterprise, Baker predicts the CSO role will see a much-needed evolution.

‘Hackers who understand technical details of attack security, but also have the business acumen to communicate with CEOs and convince CIOs of their importance will make a significant impact on the role of security professionals,’ says Baker.

‘Chris Wysopal, CTO of Veracode, as well as Alex Stamos, CSO of Facebook, are popular examples of a hacker-turned-executive – we’ll see more like them in coming years.’

Ransomware turns on the enterprise

As Michael Sutton, CISO at security-as-a-service platform Zscaler, explains, ransomware has hit a sweet spot.

‘Users are all too willing to begrudgingly pay an expensive but not excessive ransom in exchange for the return of their precious data,’ Sutton says. ‘Even the FBI are recommending that it’s easier to pay than fight. The wildly profitable CryptoLocker has attracted many clones since it was largely knocked offline following Operation Tovar.’

Many of these clones, including the more popular variants such as CryptoWall and TorrentLocker, largely followed the proven formula, but we’re starting to see variations such as mobile and Linux-focused ransomware.

‘The latter is especially important as it’s more likely to impact the websites and code repositories of enterprises, who in our experience are also very willing to pay up rather than risk losing critical intellectual property,’ says Sutton.

‘Expect ransomware to become increasingly corporate focused in 2016 and as it does, enterprises won’t get away with paying consumer rates. The criminals behind the ransomware campaigns are savvy and once they realise that they’ve locked up source code and financial documents that haven’t been properly backed up, you can expect prices to skyrocket … and be paid.’

 

Source: http://www.information-age.com

The history of computer technology, from the earliest machines to 2015

Computer technology has developed rapidly, and the computer you know today is a very smart machine, able to do almost anything and making people's work easier. Do you know the history of the computer, from the first machines until now? This article tells the long story of computer technology from its earliest beginnings up to 2015.

The computer was born not for entertainment or email but out of a need to solve a serious number-crunching crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card based computers that took up entire rooms.

Today, we carry more computing power on our smartphones than was available in these early models. The following brief history of computing is a timeline of how computers evolved from their humble beginnings to the machines of today that surf the Internet, play games and stream multimedia in addition to crunching numbers.

1801: In France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1822: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. The project, funded by the English government, is a failure. More than a century later, however, the world’s first computer was actually built.

1890: Herman Hollerith designs a punch card system to calculate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.

1936: Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.

1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts or shafts.

1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information on its main memory.

1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Calculator (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discovered how to make an electric switch with solid materials and no need for a vacuum.

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL. Thomas J. Watson Jr., son of IBM CEO Thomas J. Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: The FORTRAN programming language is born.

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.

1964: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public.

1969: A group of developers at Bell Labs produce UNIX, an operating system that addressed compatibility issues. Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities. Due to the slow nature of the system, it never quite gained traction among home PC users.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.

1971: Alan Shugart leads a team of IBM engineers who invent the “floppy disk,” allowing data to be shared among computers.

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1974-1977: A number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, RadioShack’s TRS-80 —affectionately known as the “Trash 80” — and the Commodore PET.

1975: The January issue of Popular Electronics magazine features the Altair 8800, described as the “world’s first minicomputer kit to rival commercial models.” Two “computer geeks,” Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak start Apple Computers on April Fool’s Day and roll out the Apple I, the first computer with a single-circuit board.
[Image: The TRS-80, introduced in 1977, was one of the first machines whose documentation was intended for non-geeks. Credit: RadioShack]

1977: Radio Shack’s initial production run of the TRS-80 was just 3,000. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wished.

1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.

1978: Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program.

1979: Word processing becomes a reality as MicroPro International releases WordStar.
[Image: The first IBM personal computer, introduced on Aug. 12, 1981, used the MS-DOS operating system. Credit: IBM]

1981: The first IBM personal computer, code-named “Acorn,” is introduced. It uses Microsoft’s MS-DOS operating system. It has an Intel chip, two floppy disks and an optional color monitor. Sears & Roebuck and Computerland sell the machines, marking the first time a computer is available through outside distributors. It also popularizes the term PC.

1983: Apple’s Lisa is the first personal computer with a GUI. It also features a drop-down menu and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor and the first to be marketed as a “laptop.”

1985: Microsoft announces Windows, its response to Apple’s GUI. Commodore unveils the Amiga 1000, which features advanced audio and video capabilities.

1985: The first dot-com domain name is registered on March 15, years before the World Wide Web would mark the formal beginning of Internet history. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered.

1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides speed comparable to that of mainframes.

1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web.

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1994: PCs become gaming machines as “Command & Conquer,” “Alone in the Dark 2,” “Theme Park,” “Magic Carpet,” “Descent” and “Little Big Adventure” are among the games to hit the market.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple’s court case against Microsoft in which it alleged that Microsoft copied the “look and feel” of its operating system.

1999: The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires.

2001: Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned GUI.

2003: The first 64-bit processor, AMD’s Athlon 64, becomes available to the consumer market.

2004: Mozilla’s Firefox 1.0 challenges Microsoft’s Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches.

2005: YouTube, a video sharing service, is founded. Google acquires Android, a Linux-based mobile phone operating system.

2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo’s Wii game console hits the market.

2007: The iPhone brings many computer functions to the smartphone.

2009: Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features.

2010: Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant tablet computer segment.

2011: Google releases the Chromebook, a laptop that runs the Google Chrome OS.

2012: Facebook gains 1 billion users on October 4.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

 

Source: http://www.livescience.com

The impact of technology such as Twitter on small businesses

Much has been made of how big companies like Dell, Starbucks and Comcast use Twitter to promote their products and answer customers’ questions. But today, small businesses outnumber the big ones on the free micro blogging service, and in many ways, Twitter is an even more useful tool for them.

For many mom-and-pop shops with no ad budget, Twitter has become their sole means of marketing. It is far easier to set up and update a Twitter account than to maintain a Web page. And because small-business owners tend to work at the cash register, not in a cubicle in the marketing department, Twitter’s intimacy suits them well.

“We think of these social media tools as being in the realm of the sophisticated, multiplatform marketers like Coca-Cola and McDonald’s, but a lot of these supersmall businesses are gravitating toward them because they are accessible, free and very simple,” said Greg Sterling, an analyst who studies the Internet’s influence on shopping and local businesses.

Small businesses typically get more than half of their customers through word of mouth, he said, and Twitter is the digital manifestation of that. Twitter users broadcast messages of up to 140 characters in length, and the culture of the service encourages people to spread news to friends in their own network.

Umi, a sushi restaurant in San Francisco, sometimes gets five new customers a night who learned about it on Twitter, said Shamus Booth, a co-owner.

He twitters about the fresh fish of the night — “The O-Toro (bluefin tuna belly) tonight is some of the most rich and buttery tuna I’ve had,” he recently wrote — and offers free seaweed salads to people who mention Twitter.
[Photo: Curtis Kimball, owner of a crème brûlée cart in San Francisco, uses Twitter to drive his customers to his changing location. Credit: Peter DaSilva for The New York Times]

Twitter is not just for businesses that want to lure customers with mouth-watering descriptions of food. For Cynthia Sutton-Stolle, the co-owner of Silver Barn Antiques in tiny Columbus, Tex., Twitter has been a way to find both suppliers and customers nationwide.

Since she joined Twitter in February, she has connected with people making lamps and candles that she subsequently ordered for her shop and has sold a few thousand dollars of merchandise to people outside Columbus, including to a woman in New Jersey shopping for graduation gifts.

“We don’t even have our Web site done, and we weren’t even trying to start an e-commerce business,” Ms. Sutton-Stolle said. “Twitter has been a real valuable tool because it’s made us national instead of a little-bitty store in a little-bitty town.”

Scott Seaman of Blowing Rock, N.C., also uses Twitter to expand his customer base beyond his town of about 1,500 residents. Mr. Seaman is a partner at Christopher’s Wine and Cheese shop and owns a bed and breakfast in town. He sets up searches on TweetDeck, a Web application that helps people manage their Twitter messages, to start conversations with people talking about his town or the mountain nearby. One person he met on Twitter booked a room at his inn, and a woman in Dallas ordered sake from his shop.

The extra traffic has come despite his rarely pitching his own businesses on Twitter. “To me, that’s a turn-off,” he said. Instead of marketing to customers, small-business owners should use the same persona they have offline, he advised. “Be the small shopkeeper down the street that everyone knows by name.”

Chris Mann, the owner of Woodhouse Day Spa in Cincinnati, twitters about discounts for massages and manicures every Tuesday. Twitter beats e-mail promotions because he can send tweets from his phone in a meeting and “every single business sends out an e-mail,” he said.

Even if a shop’s customers are not on Twitter, the service can be useful for entrepreneurs, said Becky McCray, who runs a liquor store and cattle ranch in Oklahoma and publishes a blog called Small Biz Survival.

In towns like hers, with only 5,000 people, small-business owners can feel isolated, she said. But on Twitter, she has learned business tax tips from an accountant, marketing tips from a consultant in Tennessee and start-up tips from the founder of several tech companies.

Anamitra Banerji, who manages commercial products at Twitter, said that when he joined the company from Yahoo in March, “I thought this was a place where large businesses were. What I’m finding more and more, to my surprise every single day, is business of all kinds.”

Twitter, which does not yet make money, is now concentrating on teaching businesses how they can join and use it, Mr. Banerji said, and the company plans to publish case studies. He is also developing products that Twitter can sell to businesses of all sizes this year, including features to verify businesses’ accounts and analyze traffic to their Twitter profiles.

According to Mr. Banerji, small-business owners like Twitter because they can talk directly to customers in a way that they were able to do only in person before. “We’re finding the emotional distance between businesses and their customers is shortening quite a bit,” he said.

 

Source: http://www.nytimes.com