CES Day Two: Autonomous Vehicles Driving Forward, Securing Connected Cars

The first day of CES saw Intel capture headlines with its keynote, talk of autonomous vehicles and a world-record-breaking drone display.

Day two put Qualcomm in the spotlight, as WIRED profiled its autonomous vehicle plans; the company is positioning itself to deliver the chips and 5G wireless networks that will enable a new wave of technology.

Harman has plans to help secure connected cars.

Amazon continued to gain mindshare with its AI assistant Alexa, which it plans to integrate into Toyota (and Lexus) vehicles. Yet Toyota remains the only major manufacturer without plans to integrate Google or Apple into its cars.

Google continued to make headlines (and not just for the rain check on its outdoor exhibit) with its integration of Google Assistant into Android Auto. Kia is one manufacturer planning to integrate Google into its cars.

AAA will begin testing autonomous cars for Torc Robotics.

Ford and Autonomic made news with their smart city plans. Ford's CEO also gave a well-timed interview to Fast Company to discuss the future of cars.

Meanwhile, outside of CES, Toyota and Mazda plan to build a $1.6 billion plant in Alabama, GM is considering how to make electric vehicles profitable, and Renault-Nissan-Mitsubishi has launched a $1 billion corporate VC fund.

CES 2018: AI Assistants and Autonomous Automobiles...oh my!

A round-up of CES 2018 news.

If you only read one story, read this AP round-up of news, much of which is captured below. Auto news includes Toyota's e-Palette concept and a Bosch solution to identify vacant parking spaces.

Google is dominating CES, bringing its AI assistant to connected cars. In the future, Google plans to work directly with auto manufacturers. Side note: Google (and others) are now working on self-doubting AI to improve decision making.

Intel revealed its first self-driving car, the payoff of its 2017 acquisition of Mobileye. The chip manufacturer also announced it is working with BMW, Nissan and VW to develop high-definition maps for use in autonomous vehicles.

NVIDIA adds Uber and VW to its roster of self-driving car customers. The NVIDIA Drive IX platform is an AI SDK for autonomous vehicles.

Aurora, an autonomous driving start-up (with founders from Google, Tesla and Uber), has signed deals with VW and Hyundai. The partnerships will integrate Aurora's sensors and software into the vehicles.

Toyota introduced the e-Palette, an autonomous concept car, which mixes mobility and commerce, ZDNet reports. More from The Verge and from WIRED.

Toyota also revealed "Platform 3.0," which eschews the boxy look of the e-Palette for a more traditional, almost elegant, autonomous driving experience, Mashable reports.

Ford announced plans for its autonomous vehicle fleet, including partnerships with Lyft, Domino's and Postmates.

Beyond CES, MIT Technology Review reports on Baidu's self-driving car program, which offers its Apollo operating system to partners for free. It currently has 90 partners, including auto manufacturers and chip makers.

It’s not rocket science: effective PR plans require research

PR professionals can learn something from Carl Sagan, who famously quipped, "If you wish to make an apple pie from scratch, you must first invent the universe."

The good news is that PR pros do not need to literally invent the universe to craft an effective PR plan; however, each plan must be made from scratch because every situation is unique. Still, there is a common process that can guide their creation. One fundamental aspect of this process is taking the time to do your research, so that you can develop an informed opinion.

Certainly, crafting a PR plan is a research and development process. PR professionals often struggle with their plans because they skimp on the research, leaving themselves uninformed. In order to see the bigger picture, make sure you understand your company, your competition, and your partners.

There are many resources available to develop this picture, from discussions with internal stakeholders to content from third party thought leaders. The best PR plans are informed by as much of this research as possible. 

Whether you work in-house or for an agency, informational interviews are the most obvious place to begin. Your goal should be to understand the business model, its challenges and its objectives - who are you trying to reach, and what is your call to action? Ideally, you will be able to connect directly with the executive team, an exercise that occurs in many agency kick-off meetings, to discuss industry trends, the product roadmap, partnerships, and events - all activities that help develop a timeline for anchor announcements.

With this understanding in hand, seek out the competition. For technology PR professionals, the most direct approach is to leverage existing industry analyst reports. Not only does this quickly identify your competition, it also identifies key stakeholders for analyst relations and aligns your plan with the analysts' broader industry vision.

Even without client access, anyone can search and review report titles, summaries and tables of contents. If your company is not included in any analyst research, that gap should be addressed in your plan, but competitive intelligence can still be gained through review and recommendation sites. You can also call upon existing media relationships to learn who reporters consider your competition and what they think of them.

Equipped with a list of competitors, audit each of their websites and social media accounts to determine what sort of content they are producing and where they are placing it. Navigate their news to identify media targets and trends. Download their resources, such as research reports, to determine if you can produce correlated content. Visit their social media accounts to research relevant hashtags and see who they are following. You want to know your competitors' messages - not to copy them, but to build upon them.

In Crossing the Chasm, Geoffrey Moore offers some pertinent advice, “In bringing this story to the business press, it is important to bring along as many of the other players in the market as possible.”

Practically, this means involving partners. Tap into customers to tell their story. Leverage business partners or investors to talk about bigger picture trends. Look for opportunities to engage with authors and academics that write about these trends. Repeat your media audit for each third party you identify.

Individually, each reference point has the potential to introduce another. Collectively, they provide the voice needed to discuss the bigger picture. For PR professionals that understand how to use their resources, the research and development of a PR plan is a piece of cake -- or an apple pie from scratch.
 

It’s already June and nothing is hotter than AI

We are six months into 2017 and though the summer sun has started beating down on us, there is nothing hotter than AI. The most visible applications of AI may be personal assistants with cute names, but Alexa, Cortana and Siri are far from the limit of what can be accomplished.

This month, Apple announced plans to introduce a machine learning API for its developers called Core ML. Some called the move a 'catch-up' to Google, which recently announced TensorFlow Lite. Both APIs will extend machine learning to the phone, enabling faster and more powerful applications.
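
To make that concrete, here is a minimal sketch of the kind of on-device inference these frameworks enable, using the TensorFlow Lite Python interpreter; the model file and input are hypothetical placeholders, and the API names reflect later TensorFlow releases rather than the original announcements:

    # Minimal sketch: load a converted TensorFlow Lite model and run inference.
    import numpy as np
    import tensorflow as tf

    interpreter = tf.lite.Interpreter(model_path="model.tflite")  # hypothetical model file
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Feed a dummy input that matches the model's expected shape.
    dummy = np.random.random_sample(tuple(input_details[0]["shape"])).astype(np.float32)
    interpreter.set_tensor(input_details[0]["index"], dummy)
    interpreter.invoke()

    prediction = interpreter.get_tensor(output_details[0]["index"])
    print(prediction)

The point is the shape of the workflow: a trained model is converted once, shipped with the app, and answers locally without a round trip to a server.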

The reality is that Apple and Google have both been developing strong AI capabilities for quite some time. Neither of them are playing catch-up, but rather are leading the pack, along with other major players (Amazon, Facebook and Microsoft jump to mind).

Of course, the real value of AI for these major players is building vast data sets to train their own neural networks. As we continue to see more applications utilize machine learning, we should likewise see improvements to their underlying infrastructure.

GPUs matter

Recently, Google announced its Cloud TPU, which can process 180 trillion floating point operations per second. Up until now, chip manufacturer NVIDIA had the lead in the market, as its powerful graphics processors served a dual use, training neural networks as well as rendering graphics. However, the cost to lease such processing power can become prohibitively expensive, particularly as we reach the end of Moore's Law. The Google Cloud TPU represents a significant shift forward, cutting neural network training that would take a full day down to six hours.

The fact that Google has produced a powerful AI chip that is only available through the cloud does help to "democratize" AI, but it still keeps the real innovation in the hands of its gatekeepers. If you cannot leave the sandbox, you cannot make glass.

Even as Google and its cohorts reap the benefits of AI development, the future is very bright. Some of the smartest developers in the world are at these organizations, and if these recent revelations are any indication of their research and development capabilities, there are more major projects in the pipeline.

The benefits of AI are innumerable, particularly as automation enters the picture. AI can lead to improved decision making since AI is not biased. And as we turn our attention to the next generation of personal assistants, we will find that AI is the new UI. For the past decade, the future has been limited by the constraints of our technology, but as we look to the future we will find the only constraint is ourselves and what we dream to create.

Hacking Team, Ransomware, and Virtualization-Enhanced Security

Originally published on CSO Outlook

The largest organizations in the world are facing thousands of attacks a day across multiple attack vectors, with the goal of breaching sensitive and valuable data. Many of these attacks begin on the endpoint, through phishing emails, watering hole attacks, drive-by downloads and zero-day attacks. However, even the most ordinary organizations still need to worry about these same endpoint attacks because cyber criminals are indiscriminate. Two of the most recent and most urgent security threats are malvertising and ransomware.

"Malvertising is highly effective because cyber criminals can target their attacks to specific demographics and deliver them with tremendous volume"

Malvertising is highly effective because cyber criminals can target their attacks to specific demographics and deliver them at tremendous volume. The online advertising model is such that ad networks simply cannot verify the validity of each and every advertisement they serve, which ultimately passes the cost of security onto security teams.

Ransomware is a highly pernicious attack; the initial compromise may occur through any number of exploits, but the end result is the encryption of all files on a system. These attacks demand payment for the key to decrypt the locked files. Depending on the value of the encrypted data, organizations may feel compelled to pay the ransom, but making a payment only encourages these attacks to continue.

In order to prevent malvertisements, ransomware and other endpoint attacks, organizations should deploy strong endpoint protection. Most traditional endpoint protection solutions are failing because they rely on detection, which allows many attacks to succeed. Instead, organizations should investigate proactive protection, such as endpoint threat isolation or virtualization-enhanced security. Additionally, ad-blocking browser extensions can be a highly effective way of mitigating malvertising attacks. Ransomware is much more difficult to mitigate, but frequent back-ups of valuable data can make remediation much easier.
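
The operational core of that back-up advice is simple enough to sketch. Here is a minimal, hypothetical example of a timestamped copy job in Python; the paths are placeholders, and real deployments would use dedicated, versioned backup tooling:

    # Minimal sketch: copy a directory to a timestamped backup location.
    import datetime
    import pathlib
    import shutil

    SOURCE = pathlib.Path("/data/finance")             # directory to protect (placeholder)
    BACKUP_ROOT = pathlib.Path("/mnt/offline-backup")  # ideally offline or write-once storage

    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    destination = BACKUP_ROOT / f"finance-{stamp}"
    shutil.copytree(SOURCE, destination)
    print(f"Backed up {SOURCE} to {destination}")

Keeping those copies offline matters as much as making them; ransomware that can reach a backup share can encrypt it too.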

Vulnerable software remains one of the greatest threats that organizations must face, which is compounded by the naivety of the end user. Vulnerable software can be patched, but the end user cannot be patched. Unfortunately, many security teams are unable to patch vulnerable software as quickly as they would like because of cross-functional politics with operations teams that are tasked with uptime. For example, a recent Bromium survey conducted at Black Hat determined that 90 percent of security professionals believe that disabling Flash would make their organization more secure, yet 41 percent believe that disabling Flash would break critical applications.

This illustrates the natural tension between security teams and operations, which is why one in five organizations take more than a month to deploy patches. When organizations take a month to deploy patches, but cyber criminals create exploits within the first day, an organization is left very vulnerable. The vulnerability of unpatched endpoint systems is exacerbated by end users with a propensity to click on anything. Cyber criminals may not even need to rely on zero-day attacks if an unaware user with an unpatched machine visits a malicious website or opens a malicious attachment.

It may seem mundane, but the biggest threats are also the most common. Cyber criminals will continue to attack the endpoint because organizations are slow to patch vulnerable software and end user behaviour is unpredictable. Traditional endpoint protection systems will fail to prevent these threats because they are based on detection. The best way to prevent these attacks is through proactive protection, such as threat isolation or virtualization-based security, which can enable a user to click on anything, even on an unpatched endpoint, without compromise.

One of the biggest stories of 2015 was the Hacking Team breach, which revealed the market for offensive malware and exploit kits. An analysis of one Hacking Team exploit kit revealed the very sophisticated capabilities of a remote access Trojan (RAT), which could surreptitiously record Skype calls, log cookie sessions and even keystrokes.

These Hacking Team revelations also identified one of many Flash zero-days in 2015, which resulted in Mozilla temporarily disabling Flash in the Firefox browser. Flash has become so problematic (in part because it is so ubiquitous) that Amazon and Google have decided to intelligently pause or disable some Flash advertisements, while Facebook has called for (but not yet implemented) an outright block of all Flash.

The Hacking Team story of 2015 warns us of weaponized malware; it makes real the dark underbelly of information security. There are malicious actors in this world that create and disseminate the tools to penetrate security solutions, a trend that will certainly continue until software becomes less vulnerable, organizations shift to more proactive protection, or both.

Many organizations are quite serious about cyber security and this is not a new trend. Cyber security spending has been increasing year over year for more than a decade. New solutions are constantly being introduced to the market, yet new attacks are constantly developed to circumvent them. However, there certainly is inertia with many organizations that continue to invest in traditional security solutions. This is the psychology of insecurity: no one ever gets fired for investing in traditional solutions, that is, until they get breached. 

Unfortunately, there is a fatalistic mantra within the security industry that "you will be breached" or "you have already been compromised," which is ultimately a self-fulfilling and self-defeating prophecy. Security vendors are trading on fear, uncertainty and doubt (FUD) to sell yet another solution that will fail to prevent an attack, only detecting it after it succeeds.

The greatest roadblock to change is the fear of change itself. There are solutions that have shifted the security paradigm from detect and respond to prevent and protect. Many organizations are already beginning to migrate to these solutions. As these organizations see success with these solutions, more and more organizations will follow their lead. In 2016, we will likely see the status quo continue to prevail (unfortunately), but more organizations will begin to make the change to a new genesis of security (fortunately!).

The emergence of endpoint threat isolation has been a breakthrough for information security. Virtualization-based security enables the ability to segregate sensitive system files from unknown and untrusted web sites, documents and processes. Earlier this year, Microsoft and Bromium announced a partnership to deliver the world's most secure endpoint by enhancing the virtualization-based security of Windows 10, which Microsoft is adopting as a fundamental security technology. Expect to see more widespread deployments of virtualization-based security to enable endpoint threat isolation in 2016.

The beauty of virtualization-based security is that information security teams can still protect vulnerable systems that are out of their control. Once an endpoint is protected with a threat isolation solution that is enabled by virtualization-based security, it does not matter if a system is unpatched or vulnerable because all user processes, Web sites and documents are quite literally separated from the host system all the way down to the chipset. By using virtualization, potential threats run in parallel to the host and can never intersect to compromise the system. Virtualization-based security enables any user to click on anything on any network, without compromise.

The IT security vicious cycle of “Assuming Compromise”

Originally published on ITProPortal on February 10, 2015

When you walk the floors of industry trade shows and speak with IT security vendors, one of the most predominant endpoint security myths is "assume you will be compromised." Of course, this is a fallacy, but as a result of this axiom, the security industry has become obsessed with detection, at the cost of protection.

Unfortunately, there are a lot of shortcomings with an IT security model based on detection. Take the Target data breach, for example. By all accounts, Target had deployed technology that did detect the attacks against it, yet the company did nothing to remediate the situation.

The reason this myth persists is because “assume you will be compromised” is a self-fulfilling prophecy. If you believe you will be compromised then you will make investments in detection and remediation, instead of considering more effective forms of endpoint protection. It is a vicious cycle: assume compromise, invest in detection, compromise occurs because of inadequate protection, threats are detected, incorrect beliefs are validated, repeat into the next budget cycle.

Defence in depth

As a result, organizations believe that deploying a multitude of security solutions enables "Defence in Depth." Bromium Labs has taken to calling this "Layers on Layers" because LOL makes hackers "laugh out loud." It is important to note that each layer has its own set of limitations, and if these limitations are shared across layers, then the number of layers doesn't matter anymore. In the recent example from Bromium Labs, the focus was exploiting the kernel, as that was the common weak link across all of the widely used legacy endpoint technologies.

Common endpoint IT security solutions focus on sandboxes, antivirus (AV), host-based intrusion prevention systems (HIPS), exploit mitigation (EMET), and hardware-assisted security (SMEP), yet a single public exploit for a Windows kernel vulnerability bypasses all of these solutions, even if they are stacked one upon another.

This highlights the weakness of a “defence in depth” architecture. The simultaneous deployment of multiple solutions sharing the same weakness is not satisfactory. The issue is far from theoretical. Modern malware (e.g. TDL4) is already using this particular exploit to gain privileges. Windows kernel vulnerabilities are frequent, and this is not going to change any time soon – we have to live with them and be able to defend against them.

Sophisticated attacks present a significant hurdle for endpoint protection. Sophisticated attacks may incorporate malicious Web sites, email or documents that have been developed to evade detection. Therefore, even diligent security teams may not be alerted to a compromise. This is the shortcoming when you “assume compromise.”

Emerging trends

Additionally, emerging technology trends, such as cloud computing and mobile employees, are relocating corporate assets beyond the corporate perimeter, increasing the need for effective endpoint protection. When a mobile user connects to an untrusted network, it is imperative that attacks don't slip through the cracks.

Beyond the sophistication of attacks, there is also a balance between security and operations. Primarily, operations is concerned with ensuring that applications run, while security is concerned with compensating for vulnerable technology. For example, an organization may have developed its own legacy application that uses outdated and unpatched versions of Java to run.

Therefore, an effective endpoint protection solution must be able to secure both legacy applications and new computing models from sophisticated new attacks without breaking them. Protection is not enough if we are not also maintaining a great user experience.

The reason it seems like endpoint security is a losing battle is because the current IT security model is broken. For example, the NIST Cybersecurity Framework organizes five basic cybersecurity functions: identify, protect, detect, respond and recover. Three-fifths of this framework (detect, respond and recover) assume compromise will occur. Similarly, industry analysts promote an advanced threat protection model of prevention, detection and remediation.

For the past two decades, threat detection has been a Band-Aid on a bullet wound. The good news is that it seems the security industry is finally starting to realize that reactive solutions, such as anti-virus, are incapable of detecting and protecting against unknown threats. Even Symantec has admitted that anti-virus is dead.

Threat detection

Threat detection systems rely on signatures to catch cyber-attacks, but the more signatures an organization has enabled, the more performance takes a hit. Organizations face a dilemma, balancing performance and security, which typically results in partial coverage as some signatures are disabled to maintain performance.

In order to stay ahead of unknown threats, organizations must adopt an architectural model that is proactive. For example, micro-virtualization delivers hardware isolation, which separates user tasks from one another and, in turn, protects the system from any attempted changes made by malware.

A robust endpoint protection solution should address the hurdles we discussed earlier, securing legacy applications and new technology initiatives from sophisticated new attacks. We can conclude that detection has failed because it is a reactive defence that attackers have learned to evade. Ironically, these reactive defences, such as signature-based detection, actually require quite a lot of activity, with their constant updates and new signatures.

Endpoint protection

Instead, we should be considering endpoint protection solutions that are passive and proactive. One example is to deploy hardware-isolated micro-virtualization, which provides a secure, isolated container for each task a user performs on an untrusted network or document. Micro-virtualization can protect against known and unknown threats without the need for constant signatures and updates. This approach to containerization on the endpoint also enables superior introspection with real-time threat intelligence, which can provide insight into attempted attacks that can be fed into other security solutions.

Finally, endpoint protection must maintain end-user productivity. It cannot negatively impact performance or the user experience or else users will work to circumvent its protection. Ideally, the best solutions are invisible and autonomous to end users. They are non-intrusive, they do not negatively impact workflows and they avoid frequent updates.

5 Tips for a Successful Company, Product Launch

Originally published on PR News on December 26, 2012

A new company or product launch is arguably the single most important activity in the PR lifecycle. Certainly, it is important to maintain consistent coverage post-launch, but it is the initial launch that catalyzes media interest and generates momentum for post-launch follow up.

Working with LEWIS PR, in the past year I have managed multiple successful company launches, including Metacloud and BlazeMeter among others, generating coverage with publications including TechCrunch, GigaOm, Wired and Forbes, among literally dozens more.

Every launch has its own idiosyncrasies, but having successfully launched so many companies, it becomes apparent that every launch has a lot in common too. Recently, I sat down to codify some of the best practices I’ve learned from my experience to share with the account teams at LEWIS PR. Today, I extend these best practices to you.

  1.  Plan ahead to get ahead—It has been said that an ounce of prevention is worth a pound of cure. The same is true for a successful public relations launch. From inception, it is imperative to understand your timeline, goals, audience, messages and spokesperson. Investing time up front into planning will ensure you aren't scrambling at the last minute. Speaking of scrambling at the last minute: avoid last-minute changes at all costs; all they do is introduce errors into the process.

  2. Time is of the essence—Do you have two months to conduct your launch or do you have two weeks? It makes a big difference. In an ideal world, you will have two months to refine your message, identify key influencers, secure third-party validation and schedule briefings. In reality, you may have to cram two months of work into two weeks. In either case, creating a working PR choreography to track back against will keep things organized and moving forward.

  3. The big picture is made of many small points—In order to develop effective messaging, we must use only that which works and take it from any place we can find it. Schedule calls with key stakeholders to identify what matters most to them, but also rely on your own industry expertise and resources such as LinkedIn profiles, corporate blog posts and company collateral, including product sheets and white papers to identify what will matter most to reporters.

  4. LMGTFY (Let Me Google That For You)—To identify analyst and media influencers, utilize the key messages you have developed to mine proprietary services such as Cision, IT Database and Meltwater News to determine which outlets and individuals are most aligned with your launch. You can also use the site-specific search capabilities of Google, as follows (see the sketch after this list):

    • "Your key message" site:www.targetwebsiteURL.com

    • For example, a search for "successful launch" site:www.prnews.com should return a result with this article.

     

  5. Pitch Writing Can Be a Challenge/Solution—Scalable, automated and highly available solutions are a dime a dozen. A successful pitch may include these ubiquitous buzzwords, but a more effective approach is to speak in real-world terms. What is the challenge businesses face today and how does your client solve it? Challenge-solution is a proven method of generating demand because the mind naturally seeks to fill gaps. Create a gap by presenting a challenge and your audience is more receptive to fill it with your solution.
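
As promised under tip 4, here is a minimal, hypothetical Python sketch of how you might build those site-specific searches programmatically; the key message and domain are placeholders, and any real use should respect the search engine's terms of service:

    # Minimal sketch: build a site-restricted Google search URL for a key message.
    import urllib.parse
    import webbrowser

    key_message = "successful launch"   # placeholder key message
    target_site = "www.prnews.com"      # placeholder outlet domain

    query = f'"{key_message}" site:{target_site}'
    url = "https://www.google.com/search?q=" + urllib.parse.quote_plus(query)

    print(url)            # paste into a browser, or:
    webbrowser.open(url)  # open the search directly

Running one query per competitor or target outlet turns this tip into a quick, repeatable audit of where your key messages already appear.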

Launching a new company or product takes a lot of dedication and finesse. Without previous experience, engaging one of these projects is like setting off to explore an uncharted realm, but these tips should provide a map. Good luck navigating the path to launch. 

Social Currency: The Revolution Has Been Tweeted

Originally published on PR News on April 27, 2011

It's hard to believe Malcolm Gladwell, the critically acclaimed author of The Tipping Point, could fail to recognize the revolutionary power of social media. In The Tipping Point, Gladwell argues that "ideas and products and messages and behaviors spread like viruses do," yet in his October 2010 New Yorker article, "Small Change: Why the Revolution Will Not Be Tweeted," he writes that social media "activism succeeds not by motivating people to make a real sacrifice, but by motivating them to do the things that people do when they are not motivated enough to make a real sacrifice."

Tell that to the Egyptian revolutionaries. In February 2011, activists revolting against Egyptian President Hosni Mubarak utilized Facebook, Twitter and YouTube to organize. Not to slight their plight by making it about social media, but many images emerged from Egypt showing protesters holding signs referencing Facebook and graffiti emblazoned with Twitter. Social media served as the platform that enabled "messages and behaviors to spread like viruses do."

Gladwell's supposition continues, "Facebook and the like are tools for building networks, which are the opposite, in structure and character, of hierarchies. Unlike hierarchies, with their rules and procedures, networks aren't controlled by a single central authority." Yet in December 2010, Richard Stallman wrote, "The Anonymous Web protests over WikiLeaks are the Internet equivalent of a mass demonstration," and those protests were organized by a few highly influential Twitter accounts. The sacrifice required was quite real; at least five arrests were made and 40 search warrants executed following the protests.

With hindsight, it’s easy to say that Gladwell is mistaken about the ability of social media to transform the world. Even so, he is still right about a couple points: Social media is useful for building networks and for motivating people. What can we learn from this to make sure we don’t miss the next revolution in our industry?

Social Media Is a Spotlight – Organizations can leverage compelling Facebook fan pages to generate Facebook likes, which build exposure within increasing social circles. Likewise, dedicated Twitter accounts can be coupled with unique hashtags to drive awareness to a particular topic, which continue to grow as posts are retweeted and responses are posted. One LEWIS Pulse client, McAfee, utilized this approach, combining the handle @McAfeeBusiness with the hashtag #SecChat to engage the community in an open conversation focused on security. As a result, hundreds of security practitioners and influencers have joined the discussion, exposing it to thousands upon thousands of followers.

Social Media Is a Microphone – Egyptian revolutionaries leveraged social networking, not just to organize “weak ties,” as Gladwell would say, but also to rally action from their followers. Organizations that distribute content multiple times a day improve brand loyalty, extend reach into related and competitive markets and reinforce credibility. This can be daunting to organizations that are comfortable with a one-way communication model, but Virgin chairman Richard Branson, Google chairman Eric Schmidt and Tony Hsieh, CEO of Zappos, all have their own Twitter feeds. You’ll be in good company.

Social Media Is a Barometer – In addition to organizing and motivating groups, social media presents the opportunity to listen to what is being said about you and your industry. There is a saying in construction, "measure twice, cut once," which is equally applicable to metrics. Competitive analysis can provide insight into how your company compares to competitors and fits into the market, negative sentiment can be identified and remediated, and positive sentiment can be reinforced.

Social media has been a revolution unto itself; with hundreds of millions of users already on Facebook, Twitter and YouTube, the revolution is quickly becoming the institution. Social media is enabling open conversations that not even dictators can prevent. The good news is that most of us aren’t working for tyrannical organizations, and as long as we learn how to embrace social media, we can join the revolution too.

Integrating Blogging and Media Relations: A Roadmap For Success

Originally published on PR News on February 10, 2011

In December 2010, WikiLeaks released hundreds of classified diplomatic cables, triggering a cascade of effects that included hacker activists or 'hacktivists' launching cyber-attacks against Amazon and MasterCard in retaliation for canceled services to WikiLeaks.

An information security company, Imperva, tracked and reported the hacktivists' methods on its blog. Our PR agency, Page One, provided this information to the press, generating dozens of clips for Imperva with major publications including The New York Times, USA Today and PC World. This route to generating mass appeal intersects traditional media relations and new media strategies. By exhibiting thought leadership, generating content like a publisher and understanding who is consuming it, companies can leverage blogging programs to support outreach to the press.

Imperva has repeated its success with multiple reports, from analyzing data breaches to revealing hacker forums with compromised government and military Web sites for sale. Links to Imperva’s blog appear all over the Internet. While the subjects change, the strategies and tactics do not. Here are a few lessons learned from our work together:

Practice Thought Leadership

No one wants to read a self-promotional blog, so avoid grandstanding. Blog topics should connect industry trends to your business. Imperva is in the business of data security, so its blog focuses on data breaches, hackers and emerging threats. Page One has supported these efforts by identifying breaking news for Imperva to track. Other companies can do the same by learning to relate their expertise to what's happening in their industry. For example, a company that sells paint would be well advised to keep abreast of trends in construction materials.

Think Like a Publisher

In the 24-hour news cycle, reporters rarely have the time to schedule and connect with expert sources on breaking stories. By developing blog content that is informative, timely and easily digestible, PR professionals can circumvent the briefing process to provide valuable information to media in minutes instead of hours. We often direct reporters to visit Imperva's blog for unique information. As a result, reporters now frequently visit Imperva's blog without prompting and link to it in their stories. Charts, statistics, surveys, analysis and visual aids are all great examples of this content.

Know Your Audience

Take the time to understand what reporters find interesting and connect them with information they desire. While Imperva was tracking WikiLeaks hacktivists, Page One was tracking the reporters writing about WikiLeaks. This enabled our team to develop new media relationships, which we continue to engage.

Public interest is fickle, so there is no guaranteed blogging success, but companies that follow these new paradigms and repeat them time after time will be paving the path for reporters to come to their blog, consume their content and disseminate it to the world.
