Friday, July 27, 2012

eDiscovery 2012 – Where We Have Been and Where We are Going – A Look Back At the First Half of the Year and Predictions For the Last Half of the Year

2012 has been an active and interesting year on the eDiscovery front thus far.  What follows are a few trends from the first part of the year and some predictions for the remainder of 2012 and beyond.

Where the eDiscovery Industry has been Over the First Part of 2012

1.  Predictive Coding – It is all the rage and this year’s hot topic in eDiscovery.  Will it revolutionize the industry and document review in particular?  Possibly.  Is it going away anytime soon?  Nope.  As an eDiscovery practitioner, do you need to know about it?  You bet you do.  The first part of 2012 has witnessed all of the major platform providers rushing to integrate this technology into their products, and some will ultimately have better products and be more successful than others.  Remember, not all so-called predictive coding tools and technologies are created equal.  So, while the trend is to offer predictive coding, time and customer satisfaction will sort out who offers the best product for the right price.  Regardless of which company or companies win this battle, quality predictive coding products are starting to be, and in the future certainly will be, major players in the field for years to come.

2. Da Silva Moore – Any discussion of predictive coding in 2012 would not be complete without a mention of the Da Silva Moore case.  The most highly discussed and scrutinized eDiscovery case in years, Da Silva Moore once focused on judicial approval of predictive coding but quickly degenerated into a motion battle focused on Judge Peck’s actions rather than the merits and proper use of predictive coding.  Nevertheless, the case has brought tons of publicity to predictive coding, and may yet have a larger impact on the technology, as the case, and all the acrimony, churns slowly on without a definitive resolution of the predictive coding issues.

3.  Spoliation and Proportionality – These topics have played second fiddle to predictive coding this year, but case law indicates that courts are considering these principles more and more, and they are holding litigants to tighter standards.  No longer can clients or their attorneys get away with claims of being unaware or ignorant when it comes to spoliation.  Likewise, litigants are becoming bolder in challenging requests for large amounts of data, and judges are agreeing to limit requests with greater frequency.  Furthermore, it is a proportionality argument that lies at the heart of predictive coding’s value and reason for use; given the ever-expanding amount of data in the world, it is no longer proportional to review every document without the aid of technology-assisted review, such as predictive coding.

4.  Consolidation – The software products used in the eDiscovery field and the companies that create them are in an arms race to see who can add the most functionality to their product across the EDRM spectrum.  This creates one-stop shop products, but may also drive niche, single-function products out of the market and raise prices.  Additionally, although the products may do it all, they may not do it all well.  Similarly, law firms are challenging eDiscovery vendors by creating their own eDiscovery practice groups and bringing the latest technology in house in an effort to bring those billable hours back into the firm, but at what cost to clients?

5. Model Orders, State Rules, and Pilot Programs, Oh My – Since late 2011, a plethora of eDiscovery-related standards, rules, model orders, and programs have been unveiled by different entities around the country, including: the U.S. Court of Appeals for the Federal Circuit, the U.S. District Court for the Southern District of New York, the U.S. District Court for the Eastern District of Texas, the U.S. District Court for the District of Delaware, the State of Pennsylvania, and the State of Florida.  Additionally, the Seventh Circuit recently concluded phase two of its Pilot Program on eDiscovery.  These various efforts are driven by a desire to standardize procedures and practices to contain eDiscovery costs and avoid unnecessary delays and disagreements.  Some will have greater longevity than others, but they are all evidence of a growing judicial and administrative recognition of the impact eDiscovery is having on our legal system and the need to do something to improve the situation.  Likewise, the diversity of solutions offered is evidence of eDiscovery’s complexity and the lack of consensus regarding how to approach and manage it.

Where will the eDiscovery Industry Go Over the Next Six Months and Beyond

1.  Da Silva Moore – The Da Silva Moore case will continue to dominate the eDiscovery headlines, both as theater, and eventually as precedent (even if unofficial).  This is by far the highest profile predictive coding case that exists, and everyone in the eDiscovery industry is waiting to see how it turns out.  Given its high profile, there will undoubtedly be much analysis and commentary on the outcome of the predictive coding battle and the case itself.  Hopefully, the scrutiny will shed some light on the cost, accuracy, and efficiency of predictive coding in a real case using real data.  If that does in fact occur, that will be the lasting legacy of Da Silva Moore on the eDiscovery world, one that is much nobler and of higher value than the soap opera it is currently perceived as.

2.  The Cream Will Rise to the Top – Certifications, conferences, and eDiscovery education providers will continue to vie for prestige, patronage, and, above all, your long-term support.  Over the past few years, we have seen numerous eDiscovery organizations and conferences spring up, including, among others, ACEDS and its annual conference, the Carmel Valley eDiscovery Retreat, and the Electronic Discovery Institute’s EDI Leadership Summit.  At times, these events have directly competed against each other and the various organizations and conferences that already exist.  At the same time, longstanding original players like Sedona and EDRM are looking at their purpose and goals and deciding on what and how they should focus their energy in the future to remain relevant and influential.  The eDiscovery conference market has reached a point of saturation, with people in the industry only willing to attend so many events a year and recognizing that there are only so many relevant panel topics.  From a participant’s perspective, why would you spend thousands of dollars to attend a conference that has four to five panels on the same topic (which topic, by the way, is also discussed at every other industry conference)?  From a vendor’s perspective, why would you spend thousands of dollars for an exhibit at a conference that is primarily attended by other vendors?  These competing organizations and conferences must find ways to differentiate themselves and provide a unique value proposition, or the market may force them out.

3.  Smart Phones, Tablets, and Social Media are Game Changers - More and more I am hearing how e-mail will soon be replaced as a communication medium by methods such as texting and tweeting among others.  While I am not ready to declare e-mail dead (or even dying), there is no doubt that data created by non-traditional devices and/or in non-traditional sources (such as smart phones and tablets and on social media sites) will continue to proliferate both in data volume as well as in potential collection sources.  New niche industries and players (X1 is an example) will develop to preserve and collect this data in an accurate and useable format, and practitioners will need to adapt and fit this data and this new technology into their processes and workflows.  Individual social media sites and companies may disappear, as may technology brands and models, but the mobile social media lifestyle itself, and the challenges it poses for eDiscovery will not disappear.  The eDiscovery industry needs to catch up as quickly as it can.

What exactly the next big thing or big case in eDiscovery will be is difficult to predict, but regardless, the eDiscovery industry has been, is, and will continue to be an interesting, evolving, fast-paced industry and one to keep an eye on.

Tuesday, June 12, 2012

Gartner Releases 2012 “Magic Quadrant for E-Discovery Software”

Gartner recently released its now yearly report “Magic Quadrant for E-Discovery Software.”  The report analyzes the biggest names in the eDiscovery software field and categorizes them into one of four groups: Leaders, Challengers, Visionaries, or Niche players.  The report focuses heavily on consolidation within the industry as well as the EDRM lifecycle, placing a high value on companies and software that service the entire EDRM lifecycle.

The writers designated six companies as leaders:

- AccessData
- Autonomy
- Guidance Software
- Recommind
- Symantec (includes Clearwell)
- ZyLAB

To be a leader, a company had to offer functionality that covers the complete EDRM lifecycle.  Additionally, offering predictive coding technology was an important positive factor in this analysis.

Some changes from the 2011 report include the exclusion of Epiq and IPRO because they no longer met at least one criterion for inclusion in the Magic Quadrant, the inclusion of KPMG and UBIC in the Magic Quadrant, and the change in status for FTI and kCura from leaders to challengers.

kCura and FTI were no longer considered leaders because both focus on the right-hand side of the EDRM only, rather than on the complete model.  This fact emphasizes how much weight the Gartner writers placed on servicing the entire EDRM lifecycle.  To be clear, the report noted that kCura’s Relativity product is still a best-in-class product.  It also spoke very highly of FTI, noting “[t]he company performs well all over the world, whereas others in its class do not necessarily have the presence or ‘bench strength’ to cover the globe, which is what many corporations need.”  Nevertheless, it likewise noted that many vendors are responding to the market with “broader end-to-end” functionality.

I agree with the report that the industry is moving toward greater consolidation and products that do it all, and I have written about that movement on this blog.  However, I believe that Gartner placed too much emphasis on this factor by making it a requirement to be a leader in the Magic Quadrant.  Certainly, one-stop products and companies that do it all offer convenience, and perhaps cost savings, and can absolutely be the best choice for you and your company.  Likewise, I continue to think that more and more products will move in that direction.  However, at this point in time, choosing a product that does it all means sacrificing quality and functionality for convenience; products and companies that service the entire EDRM lifecycle may be competent at each area, but they are not going to be the best at each area.  Depending on your situation, choosing multiple products that are the best available for each task may be a better option.  You should ask yourself: do you want one product that does everything, but only one of those things really well, or do you want three or four products that are all the best at what they do?  There is no one answer, but it is something to consider, and this will remain a choice you have to make until there is one product that is the best at everything, which could take a while.

Although the Gartner report is subjective and by no means does it analyze every product or company in the industry, overall, the creators did a good job and the report provides some interesting information and analysis.  The report concludes that the eDiscovery software industry will remain relevant while becoming more competitive, and that consolidation and the proliferation of one-stop shops and products will continue.  This prediction is spot on.

Sunday, May 20, 2012

Contract Attorneys – The Latest Addition to the Endangered Species List

Last week I read an article titled “Does Predictive Coding Spell Doom for Entry-Level Associates?”  The article was prompted in part by the attention predictive coding is currently receiving as the eDiscovery topic du jour and the starring role it has played in the increasingly soap-opera-like Da Silva Moore case.  The article concluded that entry-level associates were still necessary and vital assets, even with the rise of predictive coding.

I agree with the article’s conclusion, and am happy for the associates, but what about their less well-placed colleagues, contract attorneys?  The threat to survival that contract attorneys face comes not just from predictive coding but from law schools that spill new graduates like a broken faucet, as well as from employers that take advantage of the situation by offering unscrupulously low wages, knowing that for every position they have, there are several applicants willing to fill it at almost any rate or cost.  So, is there still a place for contract attorneys?  Will predictive coding and the deluge of law school graduates wipe out their positions, or depress their value to the point where the attorneys would make more money working at McDonalds?  I hope the answer is no, and the answer should be no if the legal community takes a moment to realize they need to treat contract attorneys like the non-fungible assets they can be, rather than as pariahs who are undeserving of earning even $20 an hour.
Despite their persona non grata reputation, a quality contract attorney is worth their weight in gold, and the legal industry should do everything it can to ensure they do not go the way of the dodo, whether because of technology, wages, or anything else.  Contract attorneys’ hands-on expertise and knowledge of review platforms and software can add great efficiency and effectiveness to a project.  Their in-depth familiarity with the documents and details of a case can be illuminating, and their understanding of the eDiscovery process can be a difference maker.  The truly good contract attorneys are knowledgeable experts that can be leveraged to your advantage and provide valuable input and consultation to your case and how you prepare for it.  More than hired mercenaries whose goal it is to plow through data as quickly as possible, contract attorneys can be your eyes and ears in the data.
At the end of the day, you get what you pay for, and nowhere is that more true than with contract attorneys.  You may be able to fill positions offering wages as low as $15 an hour, but that will not get you much more than a warm body.  With such a low rate of pay, a contract attorney will have every incentive to look everywhere and anywhere for a different job.  They will lack quality, consistency, motivation, and loyalty, resulting in a poor quality review, even if cheap.
Alternatively, as with most positions in life, the more faith and responsibility you show contract attorneys (along with paying them a decent wage for someone with a law degree), the more you obtain from them and the more value they will add to your case.  I urge you to look beyond the mere number efficiencies technology such as predictive coding can provide, to look beyond the hourly rate you are paying, and to focus instead on the intangible values added to your overall case.  That is where you find the true value and worth of your contract attorneys, and where you will find that, if utilized properly, the good ones are invaluable and indispensable.  Do not get me wrong, I am not suggesting that you should forgo the use of technology or that you should be offering your contract attorneys partner-level compensation.  I am simply saying that technology should be used to supplement and enhance your contract attorneys’ value and capabilities, not replace them.
Despite advances in technology, the human element of eDiscovery remains more vital and important than ever.  A key component of this human element is the contract attorney.  Even with the advance of predictive coding and like technologies, skilled contract attorneys should continue to be valuable commodities undeserving of a place on any endangered list.

Friday, May 11, 2012

Native Redactions – An Emerging Trend

It is a commonly accepted practice within the eDiscovery industry to image documents for production.  Likewise, it is now a commonly accepted practice, and indeed even a preferred practice, to exempt spreadsheets (and some other file types) from that requirement, instead producing those documents natively.  The idea is that parties would rather obtain native spreadsheets, allowing them to work with and view the content in a meaningful manner, than receive spreadsheet images that can be useless, cumbersome, or exceedingly difficult to accurately use and understand.  There is a nascent trend of not only producing spreadsheets in native format, but redacting them in native format as well (the concept has existed for years but has become an increasing point of emphasis as of late).

The inherent nature of a spreadsheet means that it often contains complex data located in multiple rows, columns, and tabs. The data often includes or involves the use of formulas, sorting, or filtering amongst other features.  Macros, pivot tables, and hidden content add to the complexity.  If printed, the data often falls across multiple pages in a less than complete and less than orderly manner resulting in a confusing mess that is difficult to cobble together, let alone read and use.  The fact of the matter is that images simply are unable to capture the complexities many spreadsheets contain, so if the document and its content are to be useful and meaningful, you must produce them natively.  Most litigants now recognize this and are comfortable with, and often require, the native production of spreadsheets.  Yet, traditionally they have been less than enthusiastic about redacting spreadsheets in native format. 
Given that it is an accepted practice to produce spreadsheets natively, because that is how they will be most useful, why should redactions change that?  The answer is that they should not, and more and more practitioners are beginning to realize this.  Redacting changes the data in the spreadsheet, but it does not change the nature of the spreadsheet, its functionality, or how one uses it.  If a spreadsheet needs to be produced natively to be useful in its non-redacted original state, then logically it should be produced natively to be useful in a redacted state.
Anecdotally speaking, as time goes on, I am seeing much more acceptance and understanding of the native redaction practice across the industry.  My colleagues are telling me that they are seeing the same thing.   I am confident that it is only a matter of time before redacting spreadsheets in native format is the norm and an accepted standard and practice by courts and litigants alike; native redactions simply make the most sense for spreadsheets.
One of the hang-ups for those who are unfamiliar with native redactions lies in the subconscious or gut feeling associated with making redactions to a native document.  Redacting (i.e., deleting) content from native format documents that you are producing somehow feels inherently wrong, as if there is somehow a difference between covering up the data in an image redaction and deleting it in a native redaction.  In reality, and despite this feeling, if done properly, there is no meaningful difference between image and native redactions, or between covering up and deleting.  With each method, you are hiding data in an attempt to ensure the opposing party does not see it.  Whether the data is hidden beneath a box or darkened-out area on an image, or deleted from a native document, the goal and result (hopefully) is the same: the data is not visible or searchable.  As long as you redact properly, and are open and honest with the opposing party about what type of redactions you are making, why, and how, there should be very little issue when redacting spreadsheets natively rather than via image.
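For illustration only, the "deleting rather than covering" idea can be sketched in a few lines of Python.  This is a conceptual toy over an in-memory grid of cells, not a real redaction workflow (actual spreadsheet files would require a dedicated tool or library, and the pattern and marker used here are hypothetical):

```python
import re

# A toy "sheet": rows of cell values, standing in for a spreadsheet.
sheet = [
    ["Employee", "SSN",         "Salary"],
    ["A. Smith", "123-45-6789", "90000"],
    ["B. Jones", "987-65-4321", "85000"],
]

# Hypothetical sensitive-data pattern: US Social Security numbers.
SSN_PATTERN = re.compile(r"\d{3}-\d{2}-\d{4}")

def redact_natively(rows, pattern, marker="[REDACTED]"):
    """Replace matching cell content with a marker.

    Unlike an image redaction, the underlying text is removed entirely,
    so it is neither visible nor searchable in the produced data.
    """
    return [[marker if pattern.search(cell) else cell for cell in row]
            for row in rows]

redacted = redact_natively(sheet, SSN_PATTERN)
for row in redacted:
    print(row)
```

The point of the sketch is simply that after a native-style redaction the sensitive string no longer exists anywhere in the produced content, whereas an image redaction merely draws over it.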
Of course there are risks with native redactions, and native productions in general, including the loss of metadata, loss of formulas, changing dependencies (e.g. cell values based on formulas or the values in other cells) and the risk of manipulation by the opposing party to name a few.  However, there are methods and mechanisms for addressing these risks, and you can, and should, discuss them with your eDiscovery experts and the opposing party, before taking action.
However, from a strictly results perspective, if done properly there is no reason why the native redaction of spreadsheets should not be acceptable.  This argument carries even more weight if the parties are producing non-redacted spreadsheets natively; in that instance the parties identified value in producing non-redacting spreadsheets natively, and that same value would exist for redacted spreadsheets.  Driven by this logic and the comfort that will come as litigants gain familiarity with native redactions, more and more parties will turn to native redactions for documents like spreadsheets.  In the not so distant future, natively redacting spreadsheets will be a commonly accepted practice and standard in the eDiscovery industry.

Wednesday, April 25, 2012

Da Silva Moore, Global Aerospace, and Kleen Products – Hyped Triumvirate, But Dispositive Opinion Is Yet To Come

Three recent cases have taken the spotlight in the eDiscovery world, lauded as groundbreaking for their approval of predictive coding. This blog is no exception, having contributed to the commotion, particularly that surrounding Monique da Silva Moore, et al. v. Publicis Groupe SA, et al.

In Da Silva Moore, the parties initially agreed to use predictive coding (although they never agreed to all of the details) and Magistrate Judge Peck allowed its use.  Plaintiffs have since attacked Judge Peck and most recently formally sought his recusal from the matter.  That request is currently pending.
Global Aerospace Inc., et al. v. Landow Aviation, L.P. dba Dulles, is the most recent case to address predictive coding, and it goes a step further than Da Silva Moore.  In Global Aerospace, the defendants wanted to use predictive coding themselves, but plaintiffs objected.  Virginia Circuit Court Judge James H. Chamblin ordered that defendants could use predictive coding to review documents.  As in Da Silva Moore, the court did not impose the use of predictive coding; rather, the court allowed a party to use it upon request.
Kleen Prods., LLC v. Packaging Corp. of Am. goes the furthest, and is perhaps the most interesting of the three predictive coding cases because it is different than Da Silva Moore and Global Aerospace in one very important way: the plaintiffs in Kleen are asking the court to force the defendants to use predictive coding when defendants review their own material.  The court has yet to rule on the issue.
These three cases are in the spotlight because the use of predictive coding is seemingly at issue, and yet, in some ways, predictive coding is only marginally at issue.  Yes, in one sense the courts are ruling upon the technology itself and whether it is viable; if a court allows it to be used, that is implicit recognition that the technology works, at least enough to try it out and see how it goes.  However, these cases are really about who gets to choose the technology and method utilized.  These cases and disputes could exist with fact patterns where the parties are arguing over keyword searches or linear review, and the analysis would be much the same as it is now with predictive coding.  Can the parties pick and agree to a review method and technology?  In Da Silva Moore, Judge Peck said yes.  Can one party pick how they perform their review?  The Virginia court in Global Aerospace said yes.  Can one party force another party to use certain technologies and methods to perform their review?  The Kleen court has yet to rule on the issue.
These questions are not new and novel, and so far, neither are the answers.  Yes, courts have allowed the parties to use predictive coding, but as with other technologies, the courts have taken a wait-and-see approach.  If the predictive coding technologies and/or processes used are unsuccessful in meeting obligations and needs, the courts appear more than willing to make adjustments, and perhaps embrace different technologies at that time; they are willing to give predictive coding a shot, but they are not betting the house on it either.
It is understandable why proponents of predictive coding are happy and view these cases as victories.  After all, these are the first opinions approving the technology’s use, even if in a somewhat implicit manner.  However, the industry and the legal community must wait before drawing final conclusions.  Only after a party has successfully used predictive coding in a case and survived a challenge to the results/end product (not just a challenge to its use), and it is captured in a written opinion or order, will a true victory be won by predictive coding proponents.  Until then, predictive coding is still the equivalent of a highly rated draft pick; there is a lot of potential, and most people, myself included, think it will succeed, but it still needs to prove itself in the trenches.  The predictive coding industry is bullish about its potential for success, and it may only be a matter of time until it is proved right, but only time will tell.

Monday, April 23, 2012

Plan on Planning – Help Your eDiscovery Personnel Help You

I had lunch with an eDiscovery colleague last week and he related to me a recent case he worked on. A few weeks ago, his client informed him that they had agreed with opposing party to make a production in four days. The client did not have a production population determined, and they had no idea how long it would take to create and run a production before they agreed to the deadline with opposing counsel; they picked a date in no way related to the reality that was their data set. None of this had an effect on their expectations for the viability of the project of course. The result? A rush project, extra people working extra hours to get the job done, tension, and having to renegotiate a new deadline with the opposing party because the date was simply unrealistic given the amount of data eventually involved. Ideal? No. Fun? No. Avoidable? Yes.

The above story exemplifies (although perhaps somewhat to the extreme) the experience eDiscovery personnel (whether in-house, outside counsel, or vendor) have with far too many clients in far too many cases. eDiscovery personnel are often left out of the decision-making process and have to scramble to meet artificially created deadlines that have little or no bearing on the work. We all have deadlines beyond our control, so eDiscovery personnel are no different than most in that regard; however, what can be exasperating for eDiscovery personnel is that, in the case of eDiscovery, the deadlines need not necessarily be so tight and out of our control, or at least our knowledge.

To avoid such rush projects, unobtainable deadlines, and wasted time and money, counsel should plan ahead for eDiscovery and include their eDiscovery personnel in that process, as well as in the negotiation of deadlines, to the extent possible (even if just as a point of reference and knowledge). Some easy things you can do to help your eDiscovery personnel better meet your needs, include:

• Create an eDiscovery Plan ASAP – Ideally you would create this before the case begins or soon thereafter. Be sure to include your eDiscovery personnel in this planning so that they can assist with properly setting eDiscovery related deadlines and expectations.

• Leverage Your eDiscovery Personnel’s Expertise – A classic example would be engaging them for search term analysis before agreeing to terms with the opposing party and before you make any productions. Provide the terms to your eDiscovery personnel for testing and sampling, leveraging their ability to write searches and manipulate review platforms. Via such exercises, they can sample documents testing for precision and recall, with the ultimate goal being to create a data set that is defensible and proportionate to the value of the case.

• Do Not Agree to eDiscovery Deadlines Before You Know What the Job Will Entail and Without Input from Your eDiscovery Personnel About How and If It Can Be Done – I-Med Pharma, Inc. v. Biomatrix, Civ. No. 03-3677 (DRD), (D.N.J. 2011) is a great example of why you need to know what the task entails before agreeing. The plaintiffs in the matter agreed to search terms without testing them and without the advice of their eDiscovery personnel. The terms generated over 64 million hits and 95 million pages, unreal (and expensive) numbers.

• Build in Extra Time and Do Not Wait Until the Last Minute – The only thing worse than trying to complete a complex and important project precisely and accurately, is doing so with little notice and no time for mistakes. By engaging your eDiscovery personnel early in a matter, you not only put them on notice, but it will help them help you obtain the knowledge you need to negotiate and enter into reasonable deadlines and tasks with plenty of time.
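To make the testing-and-sampling exercise described above concrete, here is a minimal illustrative sketch in Python of estimating precision and recall from a coded sample.  The data and function name are hypothetical and not tied to any particular review platform; in practice your eDiscovery personnel would draw a statistically meaningful random sample from the collection:

```python
def estimate_precision_recall(sample):
    """Estimate precision and recall of a search-term set from a coded sample.

    `sample` is a list of (hit, relevant) booleans: whether each sampled
    document hit the proposed search terms, and whether a reviewer coded
    it as relevant.
    """
    hits = sum(1 for hit, rel in sample if hit)            # docs returned by terms
    relevant = sum(1 for hit, rel in sample if rel)        # docs actually relevant
    true_positives = sum(1 for hit, rel in sample if hit and rel)
    precision = true_positives / hits if hits else 0.0     # how much of what you get is useful
    recall = true_positives / relevant if relevant else 0.0  # how much of what is useful you get
    return precision, recall

# Hypothetical coded sample of six documents: (hit_search_terms, coded_relevant)
sample = [(True, True), (True, False), (True, True),
          (False, False), (False, True), (False, False)]
precision, recall = estimate_precision_recall(sample)
print(f"precision={precision:.2f} recall={recall:.2f}")  # → precision=0.67 recall=0.67
```

A sample like this, reviewed before terms are agreed with the opposing party, is what lets counsel negotiate deadlines and scope against real numbers rather than guesses.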

You may be asking: why should I do all this; after all, are not my eDiscovery personnel paid to work for me? The answer is that, aside from making your eDiscovery personnel happier and more motivated, it will also improve your case; you will have more time to do a better job and implement quality control measures, the court and opposing party will appreciate that you can deliver on what you promise, and by planning ahead, you can create cost-saving efficiencies and avoid increased fees for rushed projects.

Friday, April 13, 2012

Consolidation of Services and Functionality: A Growing Trend in the eDiscovery Field. Will It Cost Customers in the Long Run?

Reed Smith, a US-based international law firm, announced this week that it would be bringing Relativity in-house, continuing its expansion into the eDiscovery market (in 2011, the firm established a team dedicated to eDiscovery that has grown to over 50 lawyers). This marks a developing trend in the industry; many law firms are taking deliberate steps to ensure they keep eDiscovery work in-house and take back any business they may have lost to traditionally lower-cost eDiscovery vendors and service providers. From the firms’ perspective, this makes sense; keep as much business inside the walls as possible, even if that means making capital expenditures.

Reed Smith indicated it would utilize Relativity primarily for document review, which is a relatively cheap (from the firm’s perspective) and easy way for firms to make money off their clients, money that previously often went to eDiscovery vendors who offered superior technology. Many other firms are employing a similar strategy. With the acquisition of programs like Relativity, the vendors no longer clearly offer superior technology, and decisions about who performs the work become more contingent on relationships, which are often to a law firm’s advantage. In the long term, this is smart business for the firms.

From the clients’ perspective, though, this could mean higher costs, as law firms traditionally charge higher rates than their eDiscovery vendor counterparts do. For the eDiscovery vendors, it obviously hurts, as firms like Reed Smith will take some, although not all, of the business that the vendors previously attracted because they had better technology and tools.

Software companies are likewise consolidating the functionality they provide, either via development or via acquisition. One need not look any further than kCura’s Relativity for an example of the former, while Symantec’s acquisition of Clearwell is a clear example of the latter. Both kCura and Symantec offer products regarded as best in class, and both are aggressively expanding those products’ capabilities, sometimes at the expense of other, less dynamic companies and products that not that long ago were considered must-haves in the eDiscovery world.

KCura is aggressively developing Relativity, once limited to review functionality, on both the front end and the back end of the review process. KCura is improving Relativity’s processing capabilities, adding the ability to ingest and process raw/native data, and creating tools like Fact Manager, which allows users to track and manage important facts, people, and documents within Relativity. The first improvement is a direct attack on programs like Law Prediscovery, while the latter provides direct competition to CaseMap, a Lexis product.

By developing and adding these new functions, KCura has not only increased Relativity’s value and utility, but it is also threatening formerly symbiotic products by poaching their customers; if you are going to use Relativity for a given matter, and especially if you are going to use it for multiple matters, it simply makes more sense to use the functionality built into Relativity and included in the price than to license and pay for multiple products. While products like Law Prediscovery and CaseMap will remain viable options for those not using Relativity, they will also begin to see their customer bases shrink because of Relativity, which could mean difficult times ahead if they are unable to adapt quickly.

What all of this consolidation and expansion likely means is that it is going to be more difficult for small and niche service and software providers to survive. eDiscovery shopping may become more convenient, as one-stop shops and applications become more common, but it may also become more expensive, as customers are forced to pay law firm prices and purchase programs that do everything and have a price tag to show for it.

Tuesday, April 3, 2012

Da Silva Moore Update: Judge Peck Responds to Plaintiffs' "Scorched Earth" Campaign

In the latest twist in the Da Silva Moore predictive coding case, Magistrate Judge Andrew J. Peck has responded to Plaintiffs' personal attack on him.

In a two-page Order specifically addressing Plaintiffs' March 28, 2012 letter requesting his recusal, Judge Peck projects an aura of control and restraint and is matter-of-fact in his statements. Judge Peck states:

"The Court notes that my favorable view of computer assisted review technology in general was well known to plaintiffs before I made any ruling in this case, and I have never endorsed Recommind's methodology or technology, nor received any reimbursement from Recommind for appearing at any conference that (apparently) they and other vendors sponsored, such as Legal Tech. I have had no discussions with Mr. Losey about this case, nor was I aware that he is working on the case. It appears that after plaintiffs' counsel and vendor represented to me that they agreed to the use of predictive coding, plaintiffs now claim that my public statements approving generally of computer assisted review make me biased. If plaintiffs were to prevail, it would serve to discourage judges (and for that matter attorneys) from speaking on educational panels about ediscovery (or any other subject for that matter). The Court suspects this will fall on deaf ears, but I strongly suggest that plaintiffs rethink their 'scorched earth' approach to this litigation."

Judge Peck’s response is in sharp contrast to the emotional and personal attack Plaintiffs levied against him. I applaud Judge Peck for taking the high road and sticking to the facts while still making his points. I am sure there will be more to come from this case, and the eDiscovery News blog will keep you posted with any updates we become aware of.

To read the eDiscovery News blog’s original post about Plaintiff’s attack on Judge Peck please use the following link:
Monica Bay of Law Technology News wrote an interesting article summarizing some of the commentary of Plaintiffs’ attack. That article can be reached via the following link:

Tuesday, March 27, 2012

Privilege Waived: Insufficient Privilege Log Entries Doom Defendant’s Privilege Claims

Occasionally I run into clients or an opposing party who attempt to use a metadata privilege log. Although this approach may save money (and even that point is debatable), I always advise against it, largely because any such log will invariably be overbroad, inaccurate, and incomplete, and may cause your opponent and the court to lose trust that you are being honest and straightforward with them. Most importantly, as the recent case ePlus, Inc. v. Lawson Software, Inc., No. 3:09cv620 (E.D. Va.) makes clear, you risk waiving privilege.

Metadata privilege logs are created and populated using the metadata available within the documents themselves, sometimes with little or no human input or modification. The results vary from case to case and even document to document. In some instances, a document may lack data for some or all fields, or it may contain inaccurate or even unintelligible information. As a result, the deliverable privilege log may have blank fields or incorrect data.
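To make the mechanics concrete, here is a minimal, purely hypothetical sketch of how a metadata-driven log gets built: each document's extracted metadata is mapped straight into a log row, so any field the extraction missed or garbled lands in the deliverable untouched. The field names and sample values are invented for illustration; the case does not describe any particular tool.

```python
def build_log_row(doc_metadata):
    """Map raw extracted metadata to a privilege log row.

    With no human review step, missing or garbled fields pass
    straight through to the log as blanks or junk, which is
    exactly the weakness that doomed the entries in ePlus.
    """
    fields = ["author", "recipients", "date", "privilege_type"]
    return {f: doc_metadata.get(f, "") for f in fields}

# Hypothetical extraction results for three documents.
docs = [
    {"author": "J. Smith", "recipients": "A. Jones",
     "date": "2011-04-02", "privilege_type": "Attorney-Client"},
    {"date": "2011-05-17"},           # author/recipients never captured
    {"author": "=?utf-8?B?junk?="},   # garbled header, nothing else
]

log = [build_log_row(d) for d in docs]
for row in log:
    print(row)
```

Run against these three hypothetical documents, only the first row is usable; the second has exactly the blank author/recipient fields that the ePlus court held waived privilege.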

In ePlus, the Court analyzed several privilege log entries to determine whether the defendant had waived privilege as a result of the entries themselves. Basing its analysis in part on Federal Rule of Civil Procedure 26(b)(5), which requires that a party asserting privilege “describe the nature of [privileged] documents, communications, or tangible things . . . and do so in a manner that, without revealing information itself privileged or protected, will enable other parties to assess the claim[],” the Court held as follows:

1. As to log entries that do not contain author and recipient information, privilege was waived.
2. As to communications received by ten or more non-attorneys, the entries did not demonstrate that the non-attorneys had a “need to know” the information, and privilege was waived.
3. As to communications by non-attorneys that reflect legal advice, these documents are not privileged, since defendant did not “establish that these are communications from or to an attorney or that they are communications made at the direction of an attorney.”
4. As to entries missing date information or failing to assert privilege type, those are mere “minor deficienc[ies]” that do not prevent the opponent from attacking the privilege claim, so privilege was not waived.
5. The court also held that knowingly producing documents and later deciding they are privileged does not constitute “inadvertent” disclosure, and therefore, those documents cannot be clawed back; privilege has been waived for those documents.

The ePlus case should serve as a warning to those who think they can cut corners when creating privilege logs. While minimizing human involvement in privilege log creation and leaving fields blank may save you time and perhaps money, those same actions may waive privilege and make the whole privilege log exercise (even when shortened) superfluous. Defendant Lawson Software learned this the hard way. I hope that others will learn from its mistake; at the end of the day, the dollars saved by using a metadata log, or by skimping on quality control of a human-created log, simply are not worth the risk of waiving privilege and perhaps exposing the very core of your case.

Wednesday, March 21, 2012

Update – Plaintiffs Attack Judge Peck’s Da Silva Moore Predictive Coding Order Again

Perhaps no discovery order has been so widely covered (including by this blog) or deeply analyzed as Judge Peck’s order endorsing predictive coding in the Da Silva Moore case. In the latest turn, plaintiffs have filed a Reply in support of the objection to Judge Peck’s predictive coding ruling.

From the outset, a noticeable undertone of animosity towards Judge Peck runs throughout the Reply. Plaintiffs took the opportunity to play up the connection between Judge Peck and defense counsel Ralph Losey (who is also regarded as a thought leader in the eDiscovery industry and is the author of a widely disseminated blog, among other things), and to a lesser extent Recommind, the software vendor whose computer-assisted review platform will potentially be used in this matter. Plaintiffs dedicate the first two pages (of 14 total, only 11 of which address the predictive coding dispute) to the recent professional relationship between Judge Peck and Mr. Losey, which has focused on the endorsement and discussion of predictive coding at various industry events around the country.

Asking “that the court reject MSL’s use of predictive coding and require the parties come up with a new ESI Protocol,” plaintiffs warn that “Judge Peck sets a dangerous precedent that is likely to deter future litigants from even considering predictive coding, lest they be bound by a protocol that contains no measure of reliability.” Obviously, counsel is trying to persuade the court here (which I can certainly appreciate), but I strongly disagree with this point. As I recently discussed in a blog post titled You Cannot Unring a Bell – Judge Peck’s Da Silva Moore Opinion Will Continue to Be Influential Despite Objection, regardless of the outcome of this particular objection, predictive coding will continue to be a hot topic, and litigants will use it to the extent it makes fiscal sense and produces reasonable results.

Interestingly, plaintiffs cite Kleen Prods., LLC v. Packaging Corp. of Am., No. 10 C 5711 (N.D. Ill.) in support of their arguments. Kleen is the case where plaintiffs have asked Magistrate Judge Nan R. Nolan to order defendants to use predictive coding. Plaintiffs in the Da Silva Moore matter hold Judge Nolan’s decision to require full briefing, expert reports, and an evidentiary hearing on the use of predictive coding in high regard when contrasted with Judge Peck’s relatively quick process and decision. They argue that “in his rush to be the first in line to approve predictive coding, Judge Peck did not elicit expert testimony or give the parties an opportunity to question or cross-examine the experts.”

The outcome of the Da Silva Moore predictive coding dispute is now squarely in the hands of Judge Andrew Carter, and this blog will do its best to provide further updates as they arise.

As an aside, plaintiffs also made the same argument that this blog raised in its inaugural post, Peck and Choose, regarding Judge Peck’s deference to French privacy law, stating that “Judge Peck failed to engage in the required comity analysis, under which the vast majority of U.S. precedents have found that French law does not preempt discovery.” It will be interesting to see the outcome of this issue as the matter proceeds.

Tuesday, March 20, 2012

Delta Lawyers Learn a Difficult Lesson: Court Levies Sanctions After Counsel's Reliance on IT Department

With discovery seemingly complete and in the books, the situation took an unexpected turn when defendant located new data in In re Delta/AirTran Baggage Fee Antitrust Litigation. The court sanctioned defendant Delta Air Lines for locating and producing 60,000 pages of responsive material after the close of discovery. Defendant discovered that it had inadvertently failed to search several hard drives and failed to locate several backup tapes prior to the close of discovery, despite claiming nearly 20 times that it had produced all responsive material. Defendant’s counsel asserted they relied on assurances given by the client’s IT department, and although the court did not find defendant’s omission to be intentional, it nevertheless found that defendant “did not conduct a reasonable inquiry” and levied monetary sanctions. However, given that defendant cooperated once the error was reported, the court refrained from precluding defendant’s use of the material.

Judge Timothy C. Batten identified several key errors in how defendant dispatched its discovery obligations, including:

• Defendant’s counsel did not confirm with IT that each hard drive that was supposed to be loaded for searching actually had been; counsel did e-mail IT with a list of custodian hard drives that should have been loaded, but IT “did not respond with confirmation that each listed person’s drive was on the system[.]”
• Despite intense questioning and discussion of backup tapes with the court, counsel did not personally search the location where the backup tapes were ultimately discovered; instead, counsel relied on IT’s statements regarding the contents of that location and the absence of backup tapes there.
• Prior to discovering the misplaced data, defendant repeatedly (approximately 20 times) stated to the court and plaintiffs that it had produced everything.
• Defendant did not promptly inform the court of its misrepresentations; defendant waited nearly two weeks and until after the court ruled on a spoliation issue to inform the court of the newly discovered data.

To defendant’s credit, once the court and the plaintiffs were aware of the issue, defendant was very cooperative in working to produce the responsive data. Some other measures that defendant, as a corporation confronted with numerous disputes and investigations over the course of a year, could have implemented to avoid such issues altogether include the following:

• Create and follow standardized processes for use in every matter. These processes should include tracking of the collection, processing, searching, review, and production of all data for a given matter.
• Implement and utilize a team dedicated to e-discovery that can work across matters, serving as a single point of contact for the various players involved.
• Counsel cannot merely rely on the assertions made by their client; they must be actively involved and make reasonable efforts to verify statements are accurate and complete.
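The first measure above, tracking every custodian through every stage, can be sketched very simply. This is an illustration only (the opinion describes no particular system, and every name here is invented): even a bare-bones per-custodian status record would have surfaced Delta's never-searched hard drives before, rather than after, the close of discovery.

```python
from dataclasses import dataclass

@dataclass
class CustodianStatus:
    """Hypothetical per-custodian record for one matter."""
    name: str
    collected: bool = False
    processed: bool = False
    searched: bool = False
    reviewed: bool = False
    produced: bool = False

    def outstanding(self):
        """Return the discovery stages not yet completed."""
        stages = ["collected", "processed", "searched", "reviewed", "produced"]
        return [s for s in stages if not getattr(self, s)]

# A Delta-style gap: Custodian A's drive was collected and processed
# but never loaded for searching.
custodians = [
    CustodianStatus("Custodian A", collected=True, processed=True),
    CustodianStatus("Custodian B", collected=True, processed=True,
                    searched=True, reviewed=True, produced=True),
]
incomplete = {c.name: c.outstanding() for c in custodians if c.outstanding()}
print(incomplete)
```

Before certifying to a court that production is complete, counsel (or the dedicated e-discovery team) simply confirms this report is empty, rather than relying on unverified assurances.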

The Delta decision is a good example of how a party can fall victim to an ad hoc e-discovery approach rather than a well-defined plan that is repeatable across matters. The most important takeaway for practitioners is the need to be involved in all aspects of a matter and not simply rely on assertions made by your client; taking statements at face value without reasonable inquiry into their veracity leaves one open to sanctions.

Friday, March 16, 2012

You Cannot Unring a Bell - Judge Peck's Da Silva Moore Opinion Will Continue to Be Influential Despite Objection

News recently broke noting that Magistrate Judge Andrew Peck's recent opinion addressing predictive coding is in jeopardy of becoming obsolete less than a month after causing shockwaves within the eDiscovery community.

It seems the plaintiffs in the Da Silva Moore matter took exception to the procedural and temporal irregularities surrounding the seminal opinion; Judge Peck issued the opinion after plaintiffs filed their objections, thereby depriving them of the opportunity to object to the opinion itself. Plaintiffs therefore sought leave to respond to the opinion directly, a request Judge Andrew Carter granted.

While this new twist may be significant in some respects, it does not mean Judge Peck's opinion is now meaningless, and it most certainly does not signal the end of the predictive coding movement and trend; if anything, it will only serve to draw more attention to predictive coding, the subject du jour in eDiscovery at the moment. Indeed, plaintiffs’ objections focused more on procedural and process issues than on the efficacy or validity of predictive coding in general.

Judge Peck is clearly an advocate of predictive coding, and in issuing his seminal opinion endorsing, or at least agreeing to the use of, predictive coding, he was perhaps a little eager to make his point. However, that fact does not greatly diminish the power of what he said, and it does not close the predictive coding door that Judge Peck opened (the opinion made a large splash for a reason). Despite this minor setback, the push for predictive coding will continue to move forward, and the industry will continue to look to Judge Peck's Da Silva Moore decision as a watershed moment regardless of the outcome of this new twist: you cannot unring the bell.

Wednesday, March 7, 2012

Not So eDiscovery e-Discovery Rules

Good Morning All,

As of late, I have been taking a keen interest in the development and nascent proliferation of eDiscovery "pilot programs" and other similar efforts, usually by courts, to standardize eDiscovery. So, when I saw an article put out by Thomson Reuters entitled "New York Implements new mandatory e-discovery rules," I was quite intrigued. Upon reading the article, however, I was disappointed and even shocked.

It turns out the article is all about New York state courts mandating that documents and pleadings in certain cases be filed with the court electronically. While I certainly applaud the courts of New York for their effort to go paperless, and Thomson Reuters for covering the topic, this is not an article about eDiscovery, regardless of what the title says.

Thomson Reuters' confusion did drive home a point for me. In this age of eDiscovery certifications, judges endorsing computer-assisted coding, and social media, a large part of the legal community, and an even greater portion of the community at large, has no idea what eDiscovery is, let alone how to utilize it.

While Judge Peck and others like him press forward at the forefront of the field, many practitioners have never used an electronic document review platform, still insist on reviewing documents in hard copy, or cling to any number of other antiquated discovery methods. The fact that Thomson Reuters, a respected news agency, could so easily misuse the term simply emphasizes this point; while the field of eDiscovery is making great strides and pushing forward, there are still many who do not know the basics and need to be educated. If we can somehow educate the majority of legal practitioners about eDiscovery, that, in my mind, may be a far greater and broader reaching achievement than computer-assisted coding currently is.

How will this be achieved? My answer is through those of us in the eDiscovery industry working to educate and initiate others. It will take time, but I am confident it can be done. So, I encourage all of you to take on that task, and the next time one of your colleagues asks for all the documents to be printed and put in a bankers box, take a moment to let them in on the secret that is eDiscovery.

If you would like to read the entire Thomson Reuters article please use this link:

Tuesday, March 6, 2012

Peck and Choose

Hello everyone, and welcome to my new blog. My name is Brandon, and I work for an eDiscovery service provider, i.e., a vendor. This blog is something I will be contributing to outside my role as an employee, and it will reflect my personal opinions on various eDiscovery issues. I hope that this post and those to come will, at a minimum, give you something to think about. Enjoy.

For my first post, I am writing about Magistrate Judge Peck's recent decision in the Monique Da Silva Moore v. Publicis Groupe case. This is the decision now made famous by Judge Peck's comments about predictive coding, and while those comments are important, and even groundbreaking, I am writing about the opinion for another reason: the numerous other eDiscovery issues that Judge Peck mentioned but did not discuss, and the potential consequences of those issues.

In the opinion, Judge Peck glosses over three other major eDiscovery issues that I feel deserve to be fleshed out, including:

1. Discovery and Data Collection in the EU is Not Guaranteed - Plaintiffs sought data that resided in France, and Judge Peck, without discussion, ruled that the data would not be included in the first phases of discovery because it "likely would be covered by the French privacy blocking laws[.]" What is interesting is that the quote suggests Judge Peck may abstain from requiring this data be included in discovery because of the French laws. Most federal courts in most cases will "ignore" foreign privacy laws, essentially telling litigants that they are under the jurisdiction of the US courts and US discovery rules will apply, so the litigants will have to figure out how to obtain the data or face the consequences of failing to do so. Although he did not definitively rule on the matter for future phases of discovery, this raises the question: Will this start a new trend, in which Judge Peck (and potentially other judges in the future) defers to foreign privacy laws and their impact on discoverable data?

2. FRCP Rule 26(g)(1)(A) Does Not Apply to Discovery Responses - Rule 26(g)(1)(A), of course, states that a party must sign every disclosure, certifying that it is "complete and correct as of the time it is made." Judge Peck states that this clause does not apply to discovery responses but rather to initial disclosures. Judge Peck instructs that Rule 26(g)(1)(B) applies to discovery responses, and it enunciates a proportionality principle rather than a completeness standard. Despite the fact that it is impossible to ensure completeness in discovery responses, litigants have traditionally asserted and required the "complete and correct as of the time it is made" guarantee in conjunction with productions. Will Judge Peck's analysis begin to erode the use and mandate of this guarantee?

3. The Decision to Embrace Computer-Assisted Review in This Case Was Easy - The parties in Da Silva Moore agreed to use computer-assisted review, and Judge Peck simply allowed them to do so. Judge Peck points out that the tougher question is the one raised by Kleen Prods., LLC v. Packaging Corp. of Am., where plaintiffs have asked the court to order defendants to use computer-assisted review to respond to plaintiffs' document requests. How Magistrate Judge Nan Nolan (who, by the way, is a Chair of the Seventh Circuit Electronic Discovery Pilot Program) rules on that issue may have a far greater impact than Judge Peck's decision. Stay tuned.

While the above issues do not have the groundbreaking cachet that computer-assisted review currently does, they are nonetheless relevant and important, and may likewise have a large impact on the eDiscovery landscape moving forward. I, for one, hope that they get the attention they deserve in this case and others moving forward.

Below is a link to Judge Peck's decision as well as a link to the Seventh Circuit Electronic Discovery Pilot Program Home Page.

Judge Peck's Da Silva Moore Opinion:

Seventh Circuit Electronic Discovery Pilot Program Home Page: