Trials@uspto.gov                                                   Paper 16
Tel: 571-272-7822                                    Entered: June 26, 2014

UNITED STATES PATENT AND TRADEMARK OFFICE
_______________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
_______________

TRW AUTOMOTIVE US LLC,
Petitioner,

v.

MAGNA ELECTRONICS INC.,
Patent Owner.
_______________

Case IPR2014-00258
Patent 7,994,462 B2
_______________

Before JUSTIN T. ARBES, BENJAMIN D. M. WOOD, and NEIL T. POWELL, Administrative Patent Judges.

WOOD, Administrative Patent Judge.

DECISION
Denying Institution of Inter Partes Review
37 C.F.R. § 42.108

I. INTRODUCTION

A. Background

TRW Automotive US LLC (“TRW”) filed a Petition (Paper 1, “Pet.”) requesting inter partes review of claims 1, 3, 5-13, 15-17, 19, 21, 23, 26, and 27 of U.S. Patent No. 7,994,462 B2 (Ex. 1002, “the ’462 patent”). Patent Owner Magna Electronics Inc. (“Magna”) filed a Preliminary Response (Paper 7, “Prelim. Resp.”).

Institution of an inter partes review is authorized by statute when “the information presented in the petition filed under section 311 and any response filed under section 313 shows that there is a reasonable likelihood that the petitioner would prevail with respect to at least 1 of the claims challenged in the petition.” 35 U.S.C. § 314(a); see also 37 C.F.R. § 42.108. We determine that TRW has not shown a reasonable likelihood that it would prevail with respect to at least one of the claims of the ’462 patent, as set forth more fully below. Accordingly, we do not institute an inter partes review.

B. Related Proceedings

TRW discloses that the ’462 patent has been asserted in Magna Electronics, Inc. v. TRW Automotive Holdings Corp., Case 1:12-cv-00654-PLM (W.D. Mich. 2012). The ’462 patent is also the subject of another inter partes review, TRW Automotive US LLC v. Magna Electronics Inc., IPR2014-00266.

C. The ’462 Patent (Ex. 1002)

The ’462 patent, titled “Vehicular Image Sensing System,” describes a system for controlling a vehicle—e.g., dimming the vehicle’s headlights—in response to detecting “objects of interest” in front of the vehicle—e.g., the headlights of oncoming vehicles and the taillights of leading vehicles. Ex. 1002, 1:22-27. The system uses an image sensor that divides the scene in front of the vehicle into “a plurality of spatially separated sensing regions.” Id., 2:16-19. A control circuit with a processor receives image data from the image sensor and determines if individual regions include light sources having a particular characteristic, such as a “spectral characteristic” (color), or intensity. Id., 1:65-2:9, 3:43-51. By comparing the lights’ characteristics with the “distribution” of the lights across the regions, such as the lights’ proximity to each other and to the vehicle’s central axis, the system can distinguish oncoming headlights and leading taillights from streetlights and other lights that are not of interest. Id., 9:32-61, 10:53-56. The system also may detect traffic signs and lane markers, and assist the driver in other ways, such as alerting the driver to lane changes. Id., 11:60-12:13.

D. Illustrative Claims

Claims 1 and 23 are independent, and are drawn to an image sensing system for a vehicle. Ex. 1002, 12:57-13:15, 14:35-61.
The independent claims share at least three common limitations: (1) an image sensor, comprising a two-dimensional array of light-sensing photosensor elements; (2) the image sensor being inside the vehicle on which it is mounted, having a forward field of view through the vehicle’s windshield; and (3) a control comprising a processor that processes the image data to identify objects of interest. Id., 12:57-67, 14:35-46. The dependent claims further specify the location of the image sensor inside the vehicle, the physical structure of the image sensor, the specific processing techniques that the processor uses to identify objects of interest, and the actions taken in response to identifying objects of interest. Id., 13:19-14:24, 15:1-10. Claims 1 and 23 are illustrative and are reproduced below:

    1. An image sensing system for a vehicle, said image sensing system comprising:
    an imaging sensor comprising a two-dimensional array of light sensing photosensor elements;
    said imaging sensor having a forward field of view to the exterior of a vehicle equipped with said image sensing system and through the windshield of the equipped vehicle;
    wherein said imaging sensor is operable to capture frames of image data;
    a control comprising an image processor;
    wherein said image sensing system identifies objects in said forward field of view of said image sensor via processing of said captured image data by said image processor;
    wherein said image processing comprises pattern recognition and wherein said pattern recognition comprises detection of at least one of (a) a headlight, (b) a taillight and (c) an object, and wherein said pattern recognition is based at least in part on at least one of (i) shape, (ii) reflectivity, (iii) luminance and (iv) spectral characteristic; and
    wherein said pattern recognition is enhanced by comparing image data of objects over successive frames of said captured image data.

    23. An image sensing system for a vehicle, said image sensing system comprising:
    an imaging sensor comprising a two-dimensional array of light sensing photosensor elements;
    wherein said imaging sensor is at or proximate to the in-cabin surface of the windshield of a vehicle equipped with said image sensing system, and wherein said imaging sensor is operable to capture image data;
    a control comprising an image processor;
    wherein said image sensing system identifies objects in said forward field of view of said image sensor via processing of said captured image data by said image processor;
    wherein said image processing comprises pattern recognition and wherein said pattern recognition comprises detection of at least one of (a) a headlight, (b) a taillight and (c) an object, and wherein said pattern recognition is based at least in part on at least one of (i) shape, (ii) reflectivity, (iii) luminance and (iv) spectral characteristic; and
    wherein objects are at least one of (a) qualified and (b) disqualified based, at least in part, on object motion in said field of view of said imaging sensor.

E. Prior Art Relied Upon

TRW relies upon the following prior art references:

    Kenue                                   US 4,970,653     Nov. 13, 1990    Ex. 1004
    Bottesch                                US 5,166,681     Nov. 24, 1992    Ex. 1006
    Yanagawa et al. (“Yanagawa”)            JP S62-121837    June 15, 1987    Ex. 1005
    Tadashi¹                                JP H04-127280    Apr. 28, 1992    Ex. 1011
    Wilson-Jones et al. (“Wilson-Jones”)    EP0640903A1      March 1, 1995    Ex. 1007

    Oliver Vellacott, CMOS in Camera, IEE REVIEW (May 1994) (hereinafter “Vellacott”) (Ex. 1008);
    Yon-Jian Zheng, et al., An Adaptive System for Traffic Sign Recognition, INTELLIGENT VEHICLES ’94 SYMPOSIUM (Oct. 24-26, 1994) (hereinafter “Zheng”) (Ex. 1010);

    Mai Chen, AURORA: A Vision-Based Roadway Departure Warning System, 1995 IEEE/RSJ INT’L CONF. ON INTELLIGENT ROBOTS AND SYSTEMS (Aug. 9, 1995) (hereinafter “Aurora”) (Ex. 1009).

¹ We refer to “Yanagawa” and “Tadashi” as the English translations of the original references. TRW provided affidavits attesting to the accuracy of the translations. See Exs. 1005, 1011; 37 C.F.R. § 42.63(b).

F. Asserted Grounds of Unpatentability

TRW contends that the challenged claims are unpatentable under 35 U.S.C. §§ 102 and/or 103 based on the following specific grounds (Pet. 4-5):

    Reference[s]                                         Basis    Claims Challenged
    Kenue                                                § 102    1, 3, 5-7, 10, 15, 19, 23, 26, and 27
    Yanagawa, Bottesch, and Wilson-Jones                 § 103    1, 3, 5-8, 10, 15, 19, 21, 23, 26, and 27
    Aurora with either (a) Kenue; or (b) Yanagawa,       § 103    11
      Bottesch, and Wilson-Jones
    Zheng with either (a) Kenue; or (b) Yanagawa,        § 103    12 and 13
      Bottesch, and Wilson-Jones
    Vellacott with either (a) Kenue; or (b) Yanagawa,    § 103    16 and 17
      Bottesch, and Wilson-Jones
    Yanagawa, Bottesch, Wilson-Jones and Tadashi         § 103    9

II. ANALYSIS

A. Claim Construction

Consistent with the statute and the legislative history of the AIA, the Board will interpret claims using the broadest reasonable construction. 37 C.F.R. § 42.100(b). Claim terms are given their ordinary and customary meaning as would be understood by one of ordinary skill in the art in the context of the entire disclosure. In re Translogic Tech., Inc., 504 F.3d 1249, 1257 (Fed. Cir. 2007). Any special definition for a claim term must be set forth in the specification with reasonable clarity, deliberateness, and precision. In re Paulsen, 30 F.3d 1475, 1480 (Fed. Cir. 1994). For purposes of this decision, we construe the following terms: (1) pattern recognition; and (2) wherein objects of interest are at least one of (a) qualified and (b) disqualified, at least in part, based on object motion in said field of view of said imaging sensor.

1. pattern recognition (claims 1 and 23)

The Specification states that

    Pattern recognition may be used to further assist in the detection of headlights, taillights, and other objects of interest. Pattern recognition identifies objects of interest based upon their shape, reflectivity, luminance, and spectral characteristics. For example, the fact that headlights and taillights usually occur in pairs could be used to assist in qualifying or disqualifying objects as headlights and taillights. By looking for a triad pattern, including the center high-mounted stoplight required on the rear of vehicles, stoplight recognition can be enhanced.

Ex. 1002, 11:1-10.

TRW asserts that pattern recognition should be construed as “the identification of objects based upon their shape, reflectivity, luminance, and spectral characteristic.” Pet. 8. For purposes of this decision, we agree, with one caveat: that the terms “shape, reflectivity, luminance, and spectral characteristics” are to be understood as disjunctive rather than conjunctive. In other words, “pattern recognition” must be based on at least one of these characteristics, not necessarily all of them at the same time. This is indicated by the claims themselves.
For example, claim 1 requires the image processing to comprise “pattern recognition . . . wherein said pattern recognition is based at least in part on at least one of (i) shape, (ii) reflectivity, (iii) luminance and (iv) spectral characteristic.” Ex. 1002, 13:5-11 (emphasis added). Thus, we construe “pattern recognition” to mean detection of an object of interest based upon shape, reflectivity, luminance, or spectral characteristic.

2. wherein objects are at least one of (a) qualified and (b) disqualified based, at least in part, on object motion in said field of view of said imaging sensor (claims 6, 7, and 23)

Claim 23 recites the limitation “wherein objects are at least one of (a) qualified and (b) disqualified based, at least in part, on object motion in said field of view of said imaging sensor.” Similarly, claim 6 recites “wherein detected objects are qualified based, at least in part, on object motion in said field of view of said imaging sensor,” and claim 7 recites “wherein detected objects are disqualified based, at least in part, on object motion in said field of view of said imaging sensor.” Neither TRW nor Magna proposes a construction for these terms.

The ’462 patent uses the word “qualifying” to mean determining that an object is a certain type of object, and uses the word “disqualifying” to mean determining that the object is not a certain type of object. For example, the ’462 patent discloses that “the fact that headlights and taillights usually occur in pairs could be used to assist in qualifying or disqualifying objects as headlights and taillights.” Ex. 1002, 11:5-7; see also id., 12:14-21 (“For example, a high level of ‘non-qualified’ light sources; namely, light sources that are not headlights or taillights, as well as ‘qualified’ light sources can be used to determine a measurement of the activity level around the vehicle; namely, that the vehicle is in an urban environment which may be a useful input for particular control algorithms.”).

In light of this use of the terms “qualifying” and “disqualifying,” we construe the claim language “objects are at least one of (a) qualified and (b) disqualified” to mean that objects are at least one of determined to be objects of interest and determined to not be objects of interest. Accordingly, we construe the language “wherein objects are at least one of (a) qualified and (b) disqualified based, at least in part, on object motion in said field of view of said imaging sensor” as meaning that objects are at least one of determined to be objects of interest and determined to not be objects of interest based, at least in part, on object motion in said field of view of said imaging sensor. Similarly, we construe “wherein detected objects are qualified based, at least in part, on object motion in said field of view of said imaging sensor” in claim 6 as meaning that objects are determined to be objects of interest based, at least in part, on object motion in said field of view of said imaging sensor, and construe “wherein detected objects are disqualified based, at least in part, on object motion in said field of view of said imaging sensor” in claim 7 as meaning that objects are determined to not be objects of interest based, at least in part, on object motion in said field of view of said imaging sensor.

B. Claims 1, 3, 5-7, 10, 15, 19, 23, 26, and 27—Anticipation—Kenue

TRW asserts that Kenue anticipates claims 1, 3, 5-7, 10, 15, 19, 23, 26, and 27. Pet. 12-20, 42-48.
Kenue describes a “computer vision system” that detects lane markers and obstacles in front of an automobile. Ex. 1004, 1:53-61. Kenue’s system comprises a black and white CCD [charge coupled device] video camera mounted on a vehicle’s windshield to capture the driver’s view of the road in front of the vehicle. Id., 2:28-32. A computer receives the digitized image data from the camera and, using two main algorithms, “dynamically define[s] the search area for lane markers based on the lane boundaries of the previous [image] frame, and provide[s] estimates of the position of missing markers on the basis of current frame and previous frame information.” Id., 2:32-48. The system also detects and alerts the driver to obstacles in the lane within about 50 feet of the vehicle. Id., 2:48-51.

“A claim is anticipated only if each and every element as set forth in the claim is found, either expressly or inherently described, in a single prior art reference.” Verdegaal Bros. v. Union Oil Co. of California, 814 F.2d 628, 631 (Fed. Cir. 1987). For the reasons set forth below, we are not persuaded that TRW is reasonably likely to show that Kenue teaches all of the claim limitations of claims 1, 3, 5-7, 10, 15, 19, 23, 26, and 27.

1. Claim 23

Magna asserts that TRW failed to show that Kenue anticipates claim 23. Claim 23 requires that “objects of interest are at least one of (a) qualified and (b) disqualified based, at least in part, on object motion in said field of view of said imaging sensor” (hereinafter, the “qualified-disqualified limitation”). Ex. 1002, 14:35-61. According to TRW, the following portions of Kenue teach this limitation:

    Kenue discloses “[a]fter the centroid of each marker is calculated . . ., if the markers are not found where expected . . . based on the previously detected lane geometry, the marker locations from the previous frame are used . . . . The determination of expected marker position . . . involves comparing the position of each marker centroid with that of the previous frame. If the change is more than nine pixels, a flag is set.” (1004-011 at 5:14-22)

    Kenue further states “FIG. 3 illustrates the presence of a vehicle 30 or other obstacle in the roadway of FIG. 2. If the other vehicle is close, say, within 50 feet of the trailing vehicle, it tends to obscure the lane markers 24 to such an extent that there is insufficient information to determine the boundaries. In that case an obstacle warning is given and no further image processing is done on that frame. When the obstacle 30 is more than 50 feet away the image is processed but the obstacle is effectively erased from the image by removing horizontal and vertical lines, thereby making the subsequent processing steps simpler.” (1009-101 at 3:10-21)

Pet. 18-20.

TRW’s declarant, Jeffrey A. Miller, Ph.D., further explains how these excerpts from Kenue correspond to the qualified-disqualified limitation. Referring to the first excerpt that discusses marker positions, Dr. Miller states, “[t]he above [is] qualifying an object of interest based on object movement because the processing determines if a marker position changes more than nine pixels from one frame and then sets a flag if the marker has changed accordingly.” Ex. 1012 ¶¶ 15-16. Regarding the second excerpt, Dr. Miller states, “[t]he above is disqualifying an object of interest because where an obstacle moves to more than 50 feet away, the obstacle is erased effectively from the image.” Id.
Magna disagrees that Kenue teaches this limitation. Magna asserts that the quoted language is “significantly different from the claim language, such that a mere quotation of the language does not show anticipation of the claim.” Prelim. Resp. 15; see also Prelim. Resp. 16 (same). Magna also asserts that Dr. Miller’s testimony is insufficient to explain the alleged match. Id. at 16.

We agree. As Magna states, the quoted language from Kenue does not, on its face, correspond to the qualified-disqualified limitation. For example, it is unclear from the first excerpt how lane markers are “qualified,” i.e., determined to be objects of interest. Rather, it would seem that Kenue’s system predetermines lane markers to be of interest, as it is specifically designed to detect them. Dr. Miller’s testimony in this regard is not helpful, because it assumes without discussion that the first excerpt describes “qualifying an object,” and merely explains why such qualification is based on “object movement.” Ex. 1012 ¶ 15. As for the second excerpt from Kenue on which TRW relies, it is unclear how it describes disqualifying an object, i.e., determining it to be not of interest, based on object motion. Instead, it appears to describe issuing a warning to the driver based on object position (within 50 feet of the vehicle), rather than object motion. Again, Dr. Miller’s testimony regarding this excerpt (id.) is not helpful, as it does little more than repeat the claim language. See, e.g., Ex. 1012 ¶¶ 12-13.

For the above reasons, we determine that TRW is not reasonably likely to show that Kenue teaches the qualified-disqualified limitation, and therefore has not shown a reasonable likelihood of prevailing on its assertion that Kenue anticipates claim 23, as well as claims 26 and 27, which depend from claim 23.

2. Claim 1

Magna argues that TRW failed to show that Kenue anticipates independent claim 1. Prelim. Resp. 12-17. Specifically, Magna asserts that TRW failed to show that Kenue teaches the limitation in claim 1 “wherein said pattern recognition is enhanced by comparing image data of objects over successive frames of said captured image data.” Prelim. Resp. 16. According to Magna, “TRW cited to a portion of Kenue with different language than in the claim and does not explain why the cited portion allegedly anticipates the claim limitation, specifically how ‘pattern recognition is enhanced.’” Id. Magna also contends that Dr. Miller’s testimony does not provide any additional explanation because “it appears to be an exact copy of what is in the petition and contains no other analysis or discussion.” Id. (citing Ex. 1012 ¶¶ 12-13, 67-68).

TRW relies on the following passages of Kenue as teaching this limitation:

    [The] algorithms . . . dynamically define the search area for lane markers based on the lane boundaries of the previous frame, and provide estimates of the position of missing markers on the basis of current frame and previous frame information. . . . Moreover, the search area changes with marker position MP, and increases in size if the marker was not found in the search of the previous frame. . . . After the centroid of each marker is calculated . . . if the markers are not found where expected . . . based on the previously detected lane geometry, the marker locations from the previous frame are used.

Pet. 14-16 (quoting Ex. 1004, 2:44-48, 4:37-39, 5:15-18).
We agree with Magna that neither TRW nor Dr. Miller explains sufficiently how these passages correspond to the limitation at issue. Kenue’s teaching regarding comparing a “previous frame” with a “current frame” is suggestive of “comparing image data of objects over successive frames of said captured image data,” but it is not self-evident how the quoted passages relate to “enhanc[ing] pattern recognition,” based on “at least one of shape and luminance.” Pet. 14. Thus, given TRW’s lack of explanation on this point, we are not persuaded that TRW has shown a reasonable likelihood of prevailing on this ground with respect to claim 1, and claims 3, 5-7, 10, 15, and 19, which depend from claim 1.

C. Obviousness Grounds

1. Claims 1, 3, 5-8, 10, 15, 19, 21, 23, 26, and 27—Obviousness—Yanagawa, Bottesch, and Wilson-Jones

TRW asserts that claims 1, 3, 5-8, 10, 15, 19, 21, 23, 26, and 27 are unpatentable under 35 U.S.C. § 103 as obvious over Yanagawa, Bottesch, and Wilson-Jones. Yanagawa describes a vehicle-mounted imaging apparatus that detects the headlights of oncoming vehicles and taillights of leading vehicles based on the “color features” of the lights and whether the lights are at the same height. Ex. 1005, 008-009. Yanagawa’s system dims the vehicle’s headlights, if necessary, in response to such detection. Id. at 007.

Bottesch describes a “vehicle presence detecting system” that uses a “Passive Optical Sensor” (POS) comprised of two small hollow tubes having an oval cross-section. Ex. 1006, 2:34-43, 3:67-4:3. The rear interior end of each tube supports a photosensitive device comprised of a plurality of identical photocells arranged in an array. Id., 4:3-10.

Wilson-Jones describes a video-based vehicle support system that detects other vehicles and lane markings. Ex. 1007, 005:40-41.

TRW contends that Yanagawa discloses all of the limitations of the above claims except that it does not expressly teach a two-dimensional array of light sensing photosensor elements. Pet. 23-36.² TRW asserts that this limitation is found in Bottesch. Id. at 23 (citing Ex. 1006, 2:40-45, 4:10-13, 6:57-60). Further, TRW asserts that a person of ordinary skill would have had a reason to combine Bottesch with Yanagawa because:

    To include in the camera 11 of Yanagawa (’837) at the time of the alleged invention a two-dimensional array of photosensor elements as shown in Bottesch (’681) is, under KSR, merely a simple substitution of one known element for another to obtain predictable results. Further, to the extent [held] that the two dimensional array is not present in Yanagawa (’837), Petitioner submits that those skilled in the art would consider it obvious to try to include such a two-dimensional array in the camera 11 of Yanagawa (’837), as this selection involves choosing from a finite number of identified, predictable solutions, with a reasonable expectation of success.

Pet. 24 (citing MPEP § 2141; Ex. 1012 ¶ 38).

² TRW also contends that Wilson-Jones teaches pattern recognition based in part on shape, but that this teaching is “not required” to support this proposed ground of unpatentability. Pet. 27.

TRW’s analysis falls short, as it is based on “mere conclusory statements” that cannot support an obviousness rejection. KSR Int’l Co. v. Teleflex, Inc., 550 U.S. 398, 418 (2007) (citing In re Kahn, 441 F.3d 977, 988 (Fed. Cir. 2006)). TRW does not support its “simple substitution” or “obvious to try” rationales with explanation or evidence.
TRW does not explain, for instance, why the alleged combination would be a “simple” substitution achieving “predictable results,” or why selecting Bottesch’s array, as opposed to any other, would be a choice from a “finite number” of solutions with a “reasonable expectation of success.” See Pet. 24. Dr. Miller’s testimony (Ex. 1012 ¶ 38) likewise is conclusory, as it simply repeats the above statements verbatim, and thus does not provide any support for TRW’s position. TRW’s reasons to combine Bottesch with Yanagawa thus lack a supporting “articulated reasoning with some rational underpinning” (Kahn, 441 F.3d at 988), and are, therefore, unpersuasive.

Accordingly, we are not persuaded that TRW is reasonably likely to prevail in showing that the above claims would have been obvious over Yanagawa, Bottesch, and Wilson-Jones. For the same reasons, we are not persuaded that TRW is reasonably likely to prevail in showing that claim 9, which depends from claim 1, would have been obvious over Yanagawa, Bottesch, Wilson-Jones, and Tadashi.

2. Obviousness Grounds Based on Aurora and Zheng

TRW proposes combining each of Aurora and Zheng with Kenue (or Yanagawa, Bottesch, and Wilson-Jones) to demonstrate the unpatentability of claim 11 (Aurora) and claims 12 and 13 (Zheng). Pet. 37-38. We have reviewed TRW’s analyses and conclude, as above, that TRW’s proposed reasons to combine each of Aurora and Zheng are based on conclusory statements without adequate supporting explanation or evidence. For example, TRW proposes combining Aurora with either Kenue or the Yanagawa-Bottesch-Wilson-Jones combination because “doing so would be nothing more than [the] use of a known technique to improve similar devices in the same way,” and “such a combination would also be tantamount to combining prior art elements according to known methods to yield predictable results.” Pet. 37. TRW made exactly the same assertions for combining Zheng with these references. Pet. 38. For the reasons discussed above, we are not persuaded that TRW is reasonably likely to prevail with respect to these grounds of unpatentability.

3. Claims 16 and 17—Vellacott with either (a) Kenue or (b) Yanagawa, Bottesch, and Wilson-Jones

TRW asserts that the combination of Vellacott with either (a) Kenue or (b) Yanagawa, Bottesch, and Wilson-Jones renders claims 16 and 17 unpatentable under 35 U.S.C. § 103. Pet. 39-42. We determined above (§ II.D.2) that TRW has not adequately supported the combination of Yanagawa and Bottesch, and TRW relies on its previous arguments regarding that combination. See Pet. 40-41. The combination of Vellacott with Yanagawa, Bottesch and Wilson-Jones, therefore, is likewise unsupported. Moreover, we have previously (§ II.C) determined that TRW is not reasonably likely to show that Kenue teaches all of the limitations of claim 1, from which claims 16 and 17 depend. TRW does not rely on Vellacott as teaching the missing limitation. Pet. 39-42. Accordingly, we determine that TRW is not reasonably likely to show that the combination of Kenue and Vellacott renders claims 16 and 17 unpatentable.

III. CONCLUSION

For the foregoing reasons, we determine that TRW is not reasonably likely to prevail on any of the challenged claims, and, accordingly, we deny the Petition and do not institute an inter partes review of claims 1, 3, 5-13, 15-17, 19, 21, 23, 26, and 27 of the ’462 patent.

PETITIONER:

Josh Snider
Timothy Sendek
A. Justin Poplin
LATHROP & GAGE LLP
patent@lathropgage.com
tsendek@lathropgage.com
jpoplin@lathropgage.com

PATENT OWNER:

Timothy A. Flory
Terence J. Linn
GARDNER, LINN, BURKHART & FLORY, LLP
Flory@glbf.com
linn@glbf.com

David K.S. Cornwell
STERNE, KESSLER, GOLDSTEIN & FOX PLLC
Davidc-PTAB@skgf.com