Trials@uspto.gov Paper No. 33
571-272-7822 Entered: April 11, 2016

UNITED STATES PATENT AND TRADEMARK OFFICE
____________
BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________
MICROSOFT CORPORATION,
Petitioner,
v.
IPLEARN-FOCUS, LLC,
Patent Owner.
____________
Case IPR2015-00095
Patent 8,475,174 B2
____________

Before MICHAEL W. KIM, RICHARD E. RICE, and JEREMY M. PLENZLER, Administrative Patent Judges.

RICE, Administrative Patent Judge.

FINAL WRITTEN DECISION
35 U.S.C. § 318(a) and 37 C.F.R. § 42.73

I. INTRODUCTION

A. Summary

Microsoft Corporation (“Petitioner”) filed a Petition (Paper 1, “Pet.”) requesting an inter partes review of claims 1, 2, 16–18, 22, 37–42, 48, 55, and 56 of U.S. Patent No. 8,475,174 B2 (Ex. 1001, “the ’174 Patent”). We instituted a trial as to only claims 1, 2, 18, 22, 37–42, 48, 55, and 56. Paper 12 (“Inst. Dec.”), 30.1 After institution, IpLearn-Focus, LLC (“Patent Owner”) filed a Patent Owner Response (Paper 19, “PO Resp.”), to which Petitioner filed a Reply (Paper 23, “Pet. Reply”).

1 Patent Owner filed a disclaimer under 37 C.F.R. § 1.321(a) of challenged claims 16 and 17 of the ’174 Patent. See Ex. 2006.

The parties relied at trial on the following references, declarations, and deposition testimony:

Reference    Patent No./Title                                         Date              Exhibit
Black        US 5,774,591                                             June 30, 1998 (filed Dec. 15, 1995)    Ex. 1005
Garwin       EP 0 055 338 A1                                          July 7, 1982      Ex. 1006
Hutchinson   Thomas E. Hutchinson et al., Human–Computer Interaction Using Eye-Gaze Input, 19:6 IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS 1527 (Nov./Dec. 1989)    Nov./Dec. 1989    Ex. 1004
Declaration of David Forsyth                                                             Ex. 1003
Declaration of David Crane                                                               Ex. 2009
Transcript of Deposition of David Crane                                                  Ex. 1030
Transcript of Deposition of David Forsyth                                                Ex.
2010

The grounds for trial were as follows:

References        Basis       Claims Challenged
Hutchinson        § 102(b)    1, 37, 38, 40–42, 48, 55, and 56
Hutchinson        § 103(a)    1, 2, 37, 38, 40–42, 48, 55, and 56
Black             § 102(b)    1, 2, 18, and 22
Black and Garwin  § 103(a)    37 and 39

An oral hearing was held on December 17, 2015. The transcript of the oral hearing has been entered into the record. Paper 32 (“Tr.”).

We have jurisdiction under 35 U.S.C. § 6(c). The evidentiary standard is a preponderance of the evidence. See 35 U.S.C. § 316(e); 37 C.F.R. § 42.1(d). This Final Written Decision is issued pursuant to 35 U.S.C. § 318(a) and 37 C.F.R. § 42.73.

For the reasons explained below, we determine that Petitioner has shown by a preponderance of the evidence that claims 1, 2, 18, 22, 37, and 39, but not claims 38, 40–42, 48, 55, and 56, are unpatentable.

B. Related Proceedings

Petitioner and Patent Owner are parties in a federal district court case involving the ’174 Patent (IpLearn-Focus, LLC v. Microsoft Corp., No. 3:14-cv-00151-JD (N.D. Cal.)). Pet. 2; Paper 5, 1. They also are parties in an inter partes review involving a patent (U.S. Patent No. 8,538,321 B2, “the ’321 Patent”) that is related to the ’174 Patent. Paper 5, 1–2; see IPR2015-00097, Paper 11.

C. The ’174 Patent

The ’174 Patent relates to a computer learning system including a detached sensor to monitor a student’s behavior. Ex. 1001, 1:18–20, 1:61–2:14. In one embodiment, the system includes a presenter, a non-intrusive sensor, a controller, and an indicator. Id. at 2:4–5. In this embodiment, the presenter presents study materials to a student, the sensor automatically senses the student’s concentration-sensitive behavior, the controller analyzes the behavior based on one or more rules, and the indicator indicates the student’s concentration level based on the analysis. Id. at 2:4–2:13.
The Specification states that a student’s loss of concentration can be measured, for example, using images of the student’s face. Id. at 8:47–9:40. Upon determining a loss of concentration, the system can react, for example, by changing the study materials. Id. at 11:54–62.

The Specification describes calibrating a student’s “concentration-sensitive behavior.” Id. at 2:54–56. “One type of calibration establishes the student’s behavior when the student is paying attention, and compares it with the student’s behavior when the student is working on the study materials.” Id. at 2:57–60.

In an embodiment, sensor 110 includes digital camera 180, which is positioned adjacent to monitor 178 such that it can take pictures of the student’s face while the student is looking at the monitor. Id. at 8:47–55. “To improve the performance of this embodiment, before the step of monitoring, the present invention includes the step of calibration through imaging.” Id. at 8:59–61. The Specification describes, as an example, asking the student to look at a message box displayed on the monitor so that the digital camera can take a reference image of the student’s face. Id. at 8:64–9:2. After calibration, digital camera 180 regularly captures the student’s facial image. Id. at 9:22–24. According to the Specification, a rule for use with this embodiment is: “If the student’s facial orientation is significantly different from its reference image as shown in two consecutive monitoring processes, the student has lost concentration in the study materials.” Id. at 9:37–40.

In another embodiment, study materials are presented in a “multi-windows environment,” with the student entering inputs into the system using, for example, a mouse or a keyboard. Id. at 8:20–23.
A rule for use with this embodiment is that: “If for a predetermined period of time, the inputs have been entered outside the window where the study materials reside, the student has lost concentration in the study materials.” Id. at 8:32–34.

D. Illustrative Claims

Claims 1, 16, 22, and 37 are independent. Claim 2 depends from claim 1; claim 18 depends from claim 16; and claims 38–42, 48, 55, and 56 depend from claim 37. Claims 1 and 37 are illustrative and are reproduced below:

1. A computer-implemented method for a user using at least a first window of a display, with at least some of the space outside of the first window being viewable via the display, with the space outside of the first window including at least a part of another window at least partially viewable via the display, the method comprising:

acquiring data, by a computing device, regarding a volitional input of the user, the data from a sensor sensing a physical attribute associated with the user, the sensor being configured to be detached from the physical attribute associated with the user to sense at least the physical attribute;

analyzing the data by the computing device, which is coupled to the sensor and the display;

determining by the computing device from the analyzing of the data whether the volitional input suggests the user paying attention to materials in the first window or not to materials in the first window, whereby the volitional input suggesting the user not paying attention to materials in the first window indicates a change in situation from the volitional input suggesting the user paying attention to materials in the first window; and

adjusting, by the computing device, materials for presenting to the user if the determining determines that the volitional input suggests the user not paying attention to materials in the first window,

wherein the sensor includes an imaging sensor.

Ex. 1001, 13:52–14:11.

37.
A computing system for a user using a display that is configured to display a first window, with at least some of the space outside of the first window being viewable via the display, and with the space outside of the first window including at least a part of another window at least partially viewable via the display, the computing system comprising:

a sensor configured to:

sense, in a session, an area of the user to produce a first piece of data regarding a volitional behavior of the user, the first piece of data being electronically linked to an identity of the user, with the sensor being configured to be detached from the area of the user to sense the area; and

sense, in another session, an area of the user to produce a second piece of data regarding another volitional behavior of the user, with the sensor being configured to be detached from the area of the user to sense the area; and

a processor coupled to the sensor and the display, the processor configured to compare at least the first piece of data with the second piece of data to help identify materials for presenting to the user via the display,

wherein the sensor includes a camera, and wherein the first piece of data includes an image of the user.

Id. at 19:7–30.

II. ANALYSIS

A. Claim Construction; Level of Skill in the Art

In an inter partes review, the Board gives claim terms in an unexpired patent their broadest reasonable interpretation in light of the specification of the patent in which they appear. 37 C.F.R. § 42.100(b); see also In re Cuozzo Speed Techs., LLC, 793 F.3d 1268, 1278, 1279 (Fed. Cir. 2015) (“We conclude that Congress implicitly approved the broadest reasonable interpretation standard in enacting the AIA” and “the standard was properly adopted by PTO regulation.”), cert. granted sub nom. Cuozzo Speed Techs. LLC v. Lee, 136 S. Ct. 890 (mem.) (2016).
Under that standard, a claim term generally is given its ordinary and customary meaning, as would be understood by one of ordinary skill in the art in the context of the entire disclosure. In re Translogic Tech., Inc., 504 F.3d 1249, 1257 (Fed. Cir. 2007). While our claim interpretation cannot be divorced from the specification and the record evidence, see Microsoft Corp. v. Proxyconn, Inc., 789 F.3d 1292, 1298 (Fed. Cir. 2015) (quoting In re NTP, Inc., 654 F.3d 1279, 1288 (Fed. Cir. 2011)), we must be careful not to import limitations from the specification that are not part of the claim language. See Superguide Corp. v. DirecTV Enterprises, Inc., 358 F.3d 870, 875 (Fed. Cir. 2004). Any special definition for a claim term must be set forth in the specification with reasonable clarity, deliberateness, and precision. In re Paulsen, 30 F.3d 1475, 1480 (Fed. Cir. 1994).

The parties propose different definitions of a person of ordinary skill in the art (“POSITA”). According to Petitioner’s expert, Dr. David Forsyth, a POSITA

would have had a combination of experience and education in computer vision and the design of human-computer interfaces. This typically would consist of a minimum of a bachelor of science in Computer Science or Electrical Engineering (or a related engineering field) plus either a year of graduate training or 2-4 years of relevant experience. The POSITA also would have been familiar with the design of, theory behind, principles of operation of, intended use of, and the underlying technology used in computer vision and human-computer interfaces.

Ex. 1003 ¶ 27. In contrast, Patent Owner’s expert, Mr. David Crane, testifies that a POSITA “would have had a bachelor’s degree or equivalent in the field of computer engineering, computer science or electrical engineering and at least two to three years of experience relating to human-machine interface design.” Ex. 2009 ¶ 15.
Based on the competing arguments and evidence of record, we largely accept Patent Owner’s proposed definition. The definition proposed by Petitioner appears to be overly narrow because of its focus on “computer vision.” We have made minor modifications to Patent Owner’s definition to clarify the educational requirement, and to broaden the experience requirement to include at least a year of graduate training as an alternative to work experience, consistent with the background of researchers in the field as indicated by prior art references such as Hutchinson. See Ex. 1004, 1533–1534. We determine that a POSITA would have had at least a bachelor’s degree in computer engineering, computer science, electrical engineering, or an equivalent field, and at least a year of graduate training, or two years of work experience, relating to human-machine interfaces.

1. “window”

The preamble of claim 1 recites:

[a] computer-implemented method for a user using at least a first window of a display, with at least some of the space outside of the first window being viewable via the display, with the space outside of the first window including at least a part of another window at least partially viewable via the display

(emphasis added). The preamble of claim 37 similarly recites:

[a] computing system for a user using a display that is configured to display a first window, with at least some of the space outside of the first window being viewable via the display, and with the space outside of the first window including at least a part of another window at least partially viewable via the display

(emphasis added). In the Institution Decision, we determined that the broadest reasonable interpretation consistent with the Specification of the term “window” is a bounded sub-region of a computer display that is capable of displaying a distinctly different type of information than one or more other sub-regions of the computer display. Inst. Dec.
8 (citing Ex. 1001, 8:20–42). Neither of the parties proposes any change to that interpretation, and our review of the evidence does not indicate that any change is necessary. Consequently, we maintain our previous interpretation.

2. “a speed of the user”

Claim 22 requires an imaging sensor that produces “data regarding volitional behaviors of the user at different times” and a controller that uses “a rule regarding a speed of the user to analyze the volitional behaviors of the user to help determine what to present to the user” (emphasis added).

In the Petition, Petitioner proposed to construe the term “speed” as “velocity or rate; rate of change of position or angle with respect to time.” Pet. 9. In the Patent Owner Response, Patent Owner did not propose a construction for “speed” or “speed of the user.”

The ordinary meaning of “speed” is “rate of performance or action.” See MERRIAM WEBSTER’S COLLEGIATE DICTIONARY 1129 (10th ed. 1993) (Ex. 3001). The Specification uses the term “speed” in accordance with its ordinary meaning. See, e.g., Ex. 1001, 7:30–31 (“In one embodiment, the sensor 110 monitors the speed of inputs by the student as a function of time. There are different ways to monitor the speed of inputs, such as polling periodically the corresponding devices of those inputs.”), 7:36–41 (“As the student starts working on the study materials, the inputs are entered at a certain speed.
As the student gets tired, or as the student loses concentration, this speed typically decreases.”), 8:4–7 (“If the speed of the student’s volitional inputs across a predetermined period of time is significantly lower than the reference speed, the student has lost concentration in the study materials.”), 10:23–27 (“[I]f the student’s eyes are wide open with his inputs through the mouse moving down the study materials in a fairly constant speed for a long duration of time, such as five minutes, the student’s concentration level is also high.”).

We determine that the broadest reasonable interpretation consistent with the Specification of the term “a speed of the user” is “a rate of performance or action of the user.”

3. “session”

Claim 37 recites “a sensor configured to: sense, in a session, an area of the user to produce a first piece of data regarding a volitional behavior of the user, the first piece of data being electronically linked to an identity of the user”; and “sense, in another session, an area of the user to produce a second piece of data regarding another volitional behavior of the user” (emphasis added).

Petitioner proposes to construe the term “session” based on an ordinary dictionary as “a period or term devoted to a particular activity.” Pet. 10; Pet. Reply 1–2. Petitioner asserts that the term “is used only twice in the specification: once for a ‘class session,’ and separately, for a ‘working session.’” Id. at 2 (citing Ex. 1001, 4:5–8; 10:64–67) (emphasis supplied by Petitioner). Petitioner argues that “[t]he applicants used the term without a qualifier in the claims, which means those claims are not limited to any particular kind of session.” Id.

Patent Owner criticizes Petitioner’s dictionary definition as overly broad, and asserts, based on Mr.
Crane’s declaration testimony, that: “‘In computing systems, a session begins when a user boots up an application and ends when he or she closes that application, for example, switching permanently to another application or shutting down the computer.’” PO Resp. 10–11 (quoting Ex. 2009 ¶ 30). Patent Owner further asserts, based on Mr. Crane’s testimony: “‘A software application session is made up of many processes and routines, including loading, title, calibration, menu, and main application, among others. It would be contrary to the plain and ordinary meaning of the term “session” to describe any of these processes or routines as separate sessions.’” Id. at 11 (quoting Ex. 2009 ¶ 46).2

2 Patent Owner erroneously cites paragraph 30 of Mr. Crane’s Declaration in the Patent Owner Response.

The Specification uses the term “session” in two different ways. In referring to a “class session,” the Specification uses the term “session” as a continuous period of time devoted to a particular activity. Ex. 1001, 4:5–8 (“The teaching period may last one semester or a year, or just one class session.”). In referring to a student’s “working session” at the computer, however, the Specification uses the term “session” as the period of time between booting up and closing an application, as discussed further below. See Ex. 1001, 10:58–11:3.

In the “working session” embodiment, system 100 asks for the student’s identity “when the student starts working on the study materials,” and the user’s identity is stored in the system. Id. at 10:58–61. “The student’s reference information, whether static or dynamic, is stored with the student’s identity in the memory of the system 100.” Id. at 10:61–63. “After the first working session, if the student wants to work on study materials through the system 100 again,” i.e., at the start of the second or next working session, the system retrieves from its memory the student’s reference information. Id.
at 10:64–67. In this embodiment, “the retrieved reference information can replace the step of calibration.” Id. at 10:67–11:1.

Petitioner’s proposed construction of “session” is not consistent with the embodiment just discussed. Specifically, under Petitioner’s proposed construction, there would be no reason to retrieve the user’s reference information between consecutive sessions. For example, if the student were to engage in two working sessions performed consecutively, the first session being devoted to work involving a first set of materials and the second session being devoted to work involving a second set of materials, there would be no apparent reason to retrieve the user’s reference information from memory at the start of the second working session.

In contrast, Patent Owner’s proposed construction is fully consistent with the “working session” embodiment. The “first working session” would encompass the entire period of time from “when the user starts working on the study materials” (i.e., on booting up the application) to “[a]fter the first working session” (i.e., on closing the application). By retrieving the user’s reference materials from memory at the start of the second or next working session (i.e., on booting up the application again), the step of calibration can be avoided.

Contrary to Petitioner’s argument, the qualifier “working” in “working session” does not indicate that the term “session” has the broad meaning asserted by Petitioner. See Pet. Reply 2. The qualifier indicates only that the term “session” is broader than the term “working session.”

We determine that the broadest reasonable interpretation consistent with the Specification of the term “session,” as recited in claim 37, is the period of time between booting up and closing an application on the computer, for example, by switching permanently to another application or shutting down the computer.

4.
“waiting for a period of time before adjusting”

Claim 1 recites a computer-implemented method including steps of “determining” and “adjusting.” The “adjusting” step requires “adjusting . . . if the determining determines that the volitional input suggests the user not paying attention to materials in the first window.” Claim 2 recites “[a] computer-implemented method as recited in claim 1, further comprising waiting for a period of time before adjusting” (emphasis added).

Patent Owner contends that “[c]laim 2 logically requires that the ‘waiting before adjusting step’ occur after the determining step reaches a determination.” PO Resp. 15. According to Patent Owner, “[t]his is because the recited ‘adjusting’ and ‘waiting before adjusting’ are only performed ‘if the determining determines that the volitional input suggests the user not paying attention to materials in the first window.’” Id. Patent Owner further contends: “There would be no ‘adjusting’ or ‘waiting before adjusting’ if the determining does not make this particular determination.” Id.

Petitioner argues that Patent Owner’s contention is inconsistent with the Specification, which describes an embodiment in which “waiting for a period of time” begins prior to “determining,” and “determining” depends on the completion of “waiting.” Pet. Reply 3 (citing Ex. 1001, 2:29–36). In the cited embodiment, the controller determines behavior based on the rule that “if the student is not looking at the monitor showing the study materials for a predetermined period of time, the student has lost concentration in the study materials.” Ex. 1001, 2:29–32. We agree with Petitioner that “waiting for a period of time” is a part of the “determining” step in that embodiment. See also id. at 2:23–29, 8:20–46 (describing other embodiments in which “waiting” is a part of “determining”).
As described in the Specification, changing the study materials (“adjusting”) is a potential reaction of the system depending on the behavior “determined” after “waiting” for the predetermined period of time. Id. at 2:34–37. Patent Owner has not persuaded us that the “determining” step must be performed before the “waiting” step. See Altiris, Inc. v. Symantec Corp., 318 F.3d 1363, 1370 (Fed. Cir. 2003) (holding that “nothing in the intrinsic evidence indicates that the ‘setting’ step must be performed before the ‘booting normally’ step”); Baldwin Graphic Sys., Inc. v. Siebert, Inc., 512 F.3d 1338, 1345 (Fed. Cir. 2008) (“[A]lthough a method claim necessarily recites the steps of the method in a particular order, as a general rule the claim is not limited to performance of the steps in the order recited, unless the claim explicitly or implicitly requires a specific order.”) (citing Interactive Gift Express, Inc. v. Compuserve Inc., 256 F.3d 1323, 1342–43 (Fed. Cir. 2001)). We determine that Patent Owner’s asserted claim construction is overly narrow, because claim 2 does not recite waiting for a period of time after determining and before adjusting; claim 2 simply recites “waiting for a period of time before adjusting.”

Alternatively, Patent Owner argues that “even if, arguendo, the ‘determining step’ did not have to precede the ‘waiting before adjusting’ step, these steps are still distinct steps” and, “[t]herefore, the ‘waiting before adjusting’ step cannot be the same step as ‘determining.’” PO Resp. 15. Patent Owner has not persuaded us that the required “waiting for a period of time” cannot be a part of, or included in, the “determining” step, as discussed above. Further, contrary to Patent Owner’s argument, the step of “waiting for a period of time before adjusting” can overlap, or coincide with, the “determining” step, without being “the same step as ‘determining.’” See id.
In the embodiment cited by Petitioner, for example, the step of “waiting for a period of time before adjusting” overlaps, or coincides with, the predetermined period of time during which the controller determines that the student is not looking at the monitor showing the study materials. Ex. 1001, 2:29–37.

We determine, under the broadest reasonable interpretation standard, that claim 2 encompasses the “waiting for a period of time” as part of the “determining” step, and that the step of “waiting for a period of time before adjusting” can overlap, or coincide with, the “determining” step.

5. “the determining determines that the volitional input suggests the user not paying attention to materials in the first window”

Claim 1 recites “adjusting, by the computing device, materials for presenting to the user if the determining determines that the volitional input suggests the user not paying attention to materials in the first window” (emphasis added). Neither party proposes an express construction for “determining . . . the user not paying attention to materials in the first window.” Patent Owner asserts an implicit construction, however, in its arguments relating to Black,3 as discussed below. PO Resp. 53–55.

Patent Owner argues that Black’s system “at most takes actions based on where the user is looking and where the user is paying attention.” Id. at 54. Implicit in Patent Owner’s argument is that “determining . . . the user not paying attention to materials in the first window” requires the computing device, itself, to determine whether the user is not paying attention to materials in the first window. Petitioner argues, in reply, that taking actions based on where the user is looking and paying attention “necessarily ‘suggests the user not paying attention to materials in the first window’ as required by the claim, since his or her attention is directed elsewhere.” Pet. Reply 13.
3 Black discloses a facial expression and gesture recognition system. Ex. 1005, 6:66–67.

Patent Owner’s implicit claim construction is consistent with the Specification, which describes a computing device that determines, itself, whether the user is not paying attention to study materials displayed on a monitor. For example, one embodiment determines whether the user is not paying attention to study materials (i.e., has lost concentration) by comparing images of a student’s facial orientation to a reference image indicative of the user’s facial orientation when paying attention (i.e., looking at the monitor). See Ex. 1001, 9:18–9:40. We determine that the broadest reasonable interpretation consistent with the Specification of “determining . . . the user not paying attention to materials in the first window” requires the computing device, itself, to determine whether the user is not paying attention to materials in the first window.

B. Asserted Anticipation of Claims 1, 37, 38, 40–42, 48, 55, and 56 by Hutchinson

To anticipate a patent claim under 35 U.S.C. § 102, “a single prior art reference must expressly or inherently disclose each claim limitation.” Finisar Corp. v. DirecTV Group, Inc., 523 F.3d 1323, 1334 (Fed. Cir. 2008). Under the principles of inherency, if the prior art necessarily functions in accordance with, or includes, the claimed limitations, it anticipates, even though artisans of ordinary skill may not have recognized the inherent characteristics or functioning of the prior art. MEHL/Biophile Int’l Corp. v. Milgraum, 192 F.3d 1362, 1365 (Fed. Cir. 1999) (citation omitted); In re Cruciferous Sprout Litig., 301 F.3d 1343, 1349–50 (Fed. Cir. 2002).

Petitioner challenges claims 1, 37, 38, 40–42, 48, 55, and 56 as anticipated by Hutchinson. Pet. 11–19, 29–41. As discussed below, we are persuaded that Hutchinson anticipates claim 1, but not claims 37, 38, 40–42, 48, 55, and 56.

1.
Overview of Hutchinson

Hutchinson discloses a computer system that determines whether a user’s attention is directed to a particular menu box in order to select that box from multiple menu boxes presented by a display. E.g., Ex. 1004, 1527–1530. Figure 3(b) of Hutchinson is reproduced below.

Figure 3(b) is a schematic that depicts a device called the eye-gaze-response interface computer aid (“Erica”), and shows how eye gaze operates. Id. at 1527, 1529. A principal goal of Erica is to help the physically and vocally disabled, including quadriplegics. Id. at 1527.

Hutchinson discloses that staring at one of the commands, or menu options, displayed on the computer screen for a period of time triggers the system. Id. at 1529, 1530. Hutchinson further discloses that “[w]hen the user’s eye-gaze is fixed for this period, a tone sounds and an icon (cursor) appears in the menu box in line with the gaze.” Id. at 1530. If the user continues to stare at the command or menu option after the tone sounds and the icon appears, a second tone sounds and the selected command or option is performed. Id. Hutchinson discloses that “[t]he purpose of the auditory and visual feedback is to allow the user a moment to change or abort the enabled option by altering his or her gaze accordingly.” Id. (emphasis added).

Using a tree-structured menu hierarchy, the user may select from four application areas: environmental control and non-vocal communication of personal needs; communications, including word processing; recreation, including computer games; and text reading. Id. at 1530. In the environmental control application, a menu option labeled “call the nurse” activates a loud buzzer. Id. at 1531. Other options allow the user to communicate thirst, pain, etc. and to specify, for example, the body region associated with the pain. Id. In the reading application, the root menu provides a subject index of stored text files. Id.
Selecting a subject category brings up a list of the file names. Id. When a file is selected for reading, only the bottom row of menu boxes is enabled. Id. Two boxes are used to turn pages, backwards and forwards, and the third to call up a submenu. Id. “Options of the submenu allow the user to place a bookmark on the current page, select an alternate text, and exit the application.” Id.

2. Analysis

a. Claim 1

Upon review of the competing arguments and evidence presented by the parties, we determine that Petitioner has shown by a preponderance of the evidence that Hutchinson anticipates claim 1. See Pet. 14–15, 29–32; Pet. Reply 3–6; Ex. 1003 ¶¶ 46–51. As set forth in Petitioner’s claim chart, Hutchinson discloses each limitation of claim 1, including the “acquiring,” “analyzing,” “determining,” and “adjusting” steps. See Pet. 29–32.

For example, the “adjusting” step requires “adjusting, by the computing device, materials for presenting to the user if the determining determines that the volitional input suggests the user not paying attention to materials in the first window.” Corresponding to that step, Hutchinson discloses that, after the user gazes for a period of time at a menu option in a display window, a tone sounds and an icon (cursor) appears in the display window in line with the gaze, but the enabled option is aborted if the user does not continue to gaze at the menu option for an additional period of time. Ex. 1004, 1530; see Pet. 31–32; Pet. Reply 3–6; Ex. 1003 ¶¶ 49–50. The abort mechanism of Hutchinson’s computing device discloses the “adjusting” step by aborting the enabled option (“adjusting . . . materials for presenting to the user”) if the computing device determines, itself, that the user is not directing his or her gaze at the menu option in the window (“if the determining determines that the volitional input suggests the user not paying attention to materials in the first window”). See Ex.
1004, 1530; supra Section II.A.5. Patent Owner’s opposing arguments with respect to claim 1 are not persuasive. PO Resp. 23–28. Patent Owner argues that the Petition does not demonstrate that Hutchinson discloses claim 1, at least because the Petition fails to show that Hutchinson expressly or inherently discloses “determin[ing] that the volitional input suggests the user not paying attention to materials in the first window.” Id. at 24. Patent Owner further argues that “[c]omputers can only identify that which they are programmed to identify,” id. (citing Ex. 2008, 68:13–16), and Hutchinson’s computer “is not programmed to determine whether the user is or is not paying attention,” id. at 25 (citing Ex. 2008, 61:19–62:1). Patent Owner also asserts that, “[i]n Hutchinson, a user’s eye gaze is not necessarily a proxy for attention and does not necessarily identify whether the user is not paying attention to materials in the first window.” Id. Patent Owner explains: This is because in the context of Hutchinson, the user uses his or her eye gaze to control the computer. The user may type, make selections, and/or turn the pages of a book, all solely through the movement of his or her eyes. Thus, the user of Hutchinson may need to look away from a “window” the user is paying attention to so as to perform a command. In this context, where the user’s eye functions like a remote control, a mouse or a keyboard, the user’s eye gaze does not necessarily correspond to what he or she is or is not paying attention to. Id. at 25–26. The above arguments of Patent Owner do not address Hutchinson’s abort mechanism. Contrary to Patent Owner’s arguments, we find that the abort mechanism is programmed to determine whether the user is or is not paying attention to materials in the first window, for the reasons discussed above. 
Patent Owner also argues that: “[c]hoosing to make or abort a selection does not determine whether the user is paying attention to the selected item”; “[a]borting, for example, a ‘next page’ command by directing the user’s eye gaze away from the box does not indicate the user is not paying attention to that box”; and “[w]illful choices to abort a provisionally selected menu box does not evidence ‘not paying attention’ to the ‘next page’ button.” Id. at 28. These arguments also are unpersuasive. Hutchinson’s computing device, when implementing the abort mechanism, determines, itself, whether the user is not directing his or her gaze at the contents of the display window. Not directing his or her gaze at the contents of the display window corresponds to “not paying attention to materials in the first window,” as recited in the claim. Whether the user of Hutchinson’s device is thinking about the display window when not directing his or her gaze at the window is irrelevant, because eye gaze direction is the sole feature used by Hutchinson’s computing device to determine whether the user is not paying attention to the contents in the window. Indeed, Patent Owner’s contention would appear to confer mind-reading abilities on a computer that are completely divorced from physical stimuli. We are unpersuaded such a position is logical. For the reasons given, we conclude that Petitioner has shown by a preponderance of the evidence that Hutchinson anticipates claim 1. b. Claims 37, 38, 40–42, 48, 55, and 56 Claim 37 recites “a sensor configured to: sense, in a session, an area of the user to produce a first piece of data regarding a volitional behavior of the user, the first piece of data being electronically linked to an identity of the user”; and “sense, in another session, an area of the user to produce a second piece of data regarding another volitional behavior of the user” (emphasis added). 
Claim 37 also recites “a processor coupled to the sensor and the display, the processor configured to compare at least the first piece of data with the second piece of data to help identify materials for presenting to the user via the display.” Claims 38, 40–42, 48, 55, and 56 depend, directly or indirectly, from claim 37. Claims 37, 38, 40–42, 48, 55, and 56, therefore, each require a processor that compares data from different sessions. Upon review of the competing arguments and evidence presented by the parties, we are not persuaded that Hutchinson discloses a processor that compares data from different sessions. Petitioner argues that Hutchinson discloses a processor that compares sensed data from “a calibration session” with sensed data from “another session (normal operation of the system).” Pet. 16–17, 35–36 (citing Ex. 1003 ¶¶ 28, 64–68, 71); Pet. Reply 6–8. Petitioner further argues that the calibration data “is necessarily electronically linked to the identity of the user because it is specific for the particular user, and the system uses the calibrated parameters during another session (normal operation of the system).” Pet. 16–17 (citing Ex. 1003 ¶¶ 28, 64–66). Under our claim interpretation, the term “session,” as recited in claim 37, means the period of time between booting up and closing an application on the computer, for example, by switching permanently to another application or shutting down the computer. See supra Section II.A.3. Hutchinson discloses executing a calibration routine “at the beginning of each Erica session.” Ex. 1004, 1530; see Pet. 36–37 (citing Ex. 1003 ¶ 68). Contrary to Petitioner’s argument, the time devoted to executing a calibration routine at the beginning of each Erica session is not a separate “session” under a proper claim interpretation. 
Rather, the period of time devoted to the calibration routine and the subsequent period of time devoted to normal operation of the system are both part of the same “session” (i.e., calibration is the initial portion of a session followed by normal operation without permanent switching of applications or shutting down of the computer as required by our construction). See Ex. 1004, 1530. We credit the testimony of Mr. Crane that “Hutchinson does not disclose a multi-session system.” Ex. 2009 ¶ 48; see id. ¶¶ 46–49. For the reasons given, we conclude that Petitioner has not shown by a preponderance of the evidence that Hutchinson anticipates claims 37, 38, 40–42, 48, 55, and 56. C. Asserted Obviousness of Claims 2, 37, 38, 40–42, 48, 55, and 56 over Hutchinson A claim is unpatentable for obviousness under 35 U.S.C. § 103(a) if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which the subject matter pertains. See KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 406 (2007). A patent claim composed of several elements, however, is not proved obvious merely by demonstrating that each of its elements was known, independently, in the prior art. Id. at 418. In analyzing the obviousness of a combination of prior art elements, it can be important to identify a reason that would have prompted one of skill in the art to combine the elements in the way the claimed invention does. Id. A precise teaching directed to the specific subject matter of a challenged claim is not necessary to establish obviousness. Id. Rather, “any need or problem known in the field of endeavor at the time of invention and addressed by the patent can provide a reason for combining the elements in the manner claimed.” Id. at 420. 
The question of obviousness is resolved on the basis of underlying factual determinations, including: (1) the scope and content of the prior art; (2) any differences between the claimed subject matter and the prior art; (3) the level of skill in the art; and (4) objective evidence of nonobviousness, i.e., secondary considerations, if in evidence. See Graham v. John Deere Co., 383 U.S. 1, 17–18 (1966). Petitioner challenges claims 2, 37, 38, 40–42, 48, 55, and 56 as obvious over Hutchinson. As discussed below, we are persuaded that claim 2, but not claims 37, 38, 40–42, 48, 55, and 56, would have been obvious over Hutchinson. 1. Claim 2 Claim 2 recites: 2. A computer-implemented method as recited in claim 1 further comprising waiting for a period of time before adjusting, wherein the adjusting comprises switching to alternative materials that are different from materials that would have been presented, wherein the alternative materials include materials on a product, and wherein at least a portion of the alternative materials are configured to be retrieved by the computing device, via a network, from another computing device. We agree with Petitioner that Hutchinson teaches the subject matter of claim 2. Pet. 20, 41–42; Pet. Reply 5–6; Ex. 1003 ¶¶ 52–55. We agree, for example, that Hutchinson discloses “waiting for a period of time before adjusting,” as required by claim 2, “so that the user has a moment to alter his or her gaze before the system adjusts what is being presented.” Pet. 20, 41 (citing Ex. 1004, 1530; Ex. 1003 ¶ 52). With respect to the “network” requirement, we also agree that a person of ordinary skill in the art would have known to retrieve materials from another computing device over a network. Id. at 20, 42 (citing Ex. 1003 ¶¶ 9, 15, 55). There is evidence of record that Black, for example, discloses such retrieval. Ex. 1003 ¶ 97. 
Further, we agree that Hutchinson discloses “adjusting comprises switching to alternative materials that are different from materials that would have been presented,” and that including “materials on a product” would have been obvious, as Petitioner contends. Pet. 20, 41–42 (citing Ex. 1004, 1530–1531; Ex. 1003 ¶¶ 52–55). Patent Owner responds that: “the ‘waiting before adjusting’ step must be performed after the ‘determining’ step and, additionally, be a distinct step from the ‘determining step’”; and that “Hutchinson . . . does not disclose a ‘waiting before adjusting step’ that is either separate and distinct from the ‘determining’ step or subsequent to it.” PO Resp. 29. Patent Owner argues that, “[a]t most, Hutchinson discloses waiting as part of the determining step.” Id. at 30. Patent Owner’s argument is not consistent with our interpretation that claim 2 encompasses “waiting for a period of time” as part of the “determining” step. See supra Section II.A.4. Hutchinson’s system satisfies the “waiting for a period of time before adjusting” step by allowing the user “a moment” to change or abort an enabled option before the system performs it. Ex. 1004, 1530 (emphasis added). The system performs the option only if the user continues to stare at the enabled option for a period of time (“a moment”) after a tone sounds and an icon appears. Id. For the reasons given, we conclude that Petitioner has shown by a preponderance of the evidence that Hutchinson renders obvious claim 2. 2. Claims 37, 38, 40–42, 48, 55, and 56 Petitioner contends that any gaps in Hutchinson’s disclosure with respect to the limitations of challenged claims 37, 38, 40–42, 48, 55, and 56 would have been obvious over Hutchinson in view of the knowledge of a POSITA: to the extent these claims are interpreted . . . 
such that any limitations are considered not to be anticipated by Hutchinson alone, such limitations are also rendered obvious by Hutchinson, as a POSITA would have found such limitations obvious in Hutchinson’s disclosure in light of the ordinary knowledge of the POSITA and the state of the art known before August 13, 1996. Pet. 19 (citing Ex. 1003 ¶ 82). Petitioner generally relies on Dr. Forsyth’s declaration testimony to support its alternative obviousness challenges: Additionally, to the extent that any variances in claim scope are not necessarily shown by the support included in the below claim charts, such variances would have been obvious to a POSITA based on the level of ordinary skill in the pertinent art as it existed at the purported time of the invention for the reasons cited in the accompanying expert declaration of Dr. Forsyth. Id. (citing Ex. 1003 ¶¶ 9–12, 19–30, 41–82). Initially, we note that our Institution Decision indicated that “Petitioner has not presented an analysis sufficient to support modifying Hutchinson beyond the teachings of Hutchinson itself.” Paper 12, 13. As discussed above, Hutchinson does not disclose a processor that compares data from different sessions and, therefore, does not anticipate claims 37, 38, 40–42, 48, 55, and 56. See supra Section II.B.2.b. While Petitioner argues generally that any gaps in Hutchinson’s disclosure would have been obvious, Petitioner does not provide any analysis or reasoning in the Petition or the Reply to explain why a processor that compares data from different sessions would have been obvious over Hutchinson in view of the knowledge of a person of ordinary skill in the art. Indeed, as Patent Owner argues, the Petition does not discuss certain portions of Dr. Forsyth’s Declaration that appear to address the obviousness issue. See PO Resp. 38 n.6; see, e.g., Ex. 
1003 ¶ 66.4 In the Patent Owner Response, Patent Owner characterizes those portions of the Declaration as “additional arguments not discussed in the Petition,” and states that Patent Owner is not responding to the undiscussed “arguments” so as to avoid any reply by Petitioner. PO Resp. 38 n.6. On this record, we determine that Petitioner has not provided analysis or reasoning, with sufficient particularity, to establish the obviousness of claims 37, 38, 40–42, 48, 55, and 56 over Hutchinson. See 35 U.S.C. § 312(a)(3); 37 C.F.R. §§ 42.22(a), (c), 42.104(b)(4)–(5); KSR, 550 U.S. at 418 (stating that “‘rejections on obviousness grounds cannot be sustained by mere conclusory statements.’”) (quoting In re Kahn, 441 F.3d 977, 988 (Fed. Cir. 2006)). 4 “[A] POSITA would be motivated to save the calibration results for the future so that calibration would not need to occur every time the system is used. In this area of technology, saving data in a computer program was very well known to a POSITA.” For the reasons given, we conclude that Petitioner has not shown by a preponderance of the evidence that Hutchinson renders obvious claims 37, 38, 40–42, 48, 55, and 56. D. Asserted Anticipation of Claims 1, 2, 18, and 22 by Black Petitioner contends that Black anticipates claims 1, 2, 18, and 22. Pet. 21–24, 42–52; Pet. Reply 12–15. As discussed below, we are persuaded that Black anticipates claims 18 and 22, but not claims 1 and 2. 1. Overview of Black Black discloses a system that tracks human head and facial features over time by analyzing a sequence of images. Ex. 1005, Abstract. By tracking head and eye motions of a user operating a computer, the system can detect where on the screen the user is looking. Id. at 27:18–20. 2. Analysis a. 
Claims 1 and 2 Claim 1 recites “adjusting, by the computing device, materials for presenting to the user if the determining determines that the volitional input suggests the user not paying attention to materials in the first window” (emphasis added). Claim 2 depends from claim 1. According to Petitioner, Black discloses all limitations of claims 1 and 2, including the “determining” requirement emphasized above. Pet. 22–23, 42–47. Patent Owner responds that Black does not disclose the “determining” requirement because Black’s system “at most takes actions based on where the user is looking and where the user is paying attention.” PO Resp. 54. Petitioner replies that taking actions based on where the user is looking and paying attention “necessarily ‘suggests the user not paying attention to materials in the first window’ as required by the claim, since his or her attention is directed elsewhere.” Pet. Reply 13. We agree with Patent Owner. Petitioner’s arguments do not persuade us that Black’s computing device, itself, determines whether the user is not paying attention to materials in the first window, as required under our claim interpretation. See supra Section II.A.5. For the reasons given, we conclude that Petitioner has not shown by a preponderance of the evidence that Black anticipates claims 1 and 2. b. Claim 18 Claim 18 depends from claim 16.5 5 Claim 16 has been disclaimed, as discussed above. The limitations of claim 18 incorporated from claim 16 require, inter alia, a computer system comprising a controller configured to “analyze the data [from the imaging sensor regarding volitional behaviors that depend on at least facial information of the user] to help determine what to present to the user . . . 
, wherein the system is configured to sense and analyze another volitional behavior of the user using another sensor to help determine what to present to the user, the another volitional behavior of the user being of a type different from the volitional behaviors of the user.” Claim 18 additionally requires that “the facial information is related to an eye behavior of the user.” Upon review of the competing arguments and evidence of record, we determine that Black discloses each limitation of claim 18. Pet. 47–50; see Ex. 1003 ¶¶ 98–104, 106; Pet. Reply 14. We are persuaded, for example, that Black discloses a system that analyzes the data from an imaging sensor to help determine what to present to the user based upon changes in the volitional behaviors of the user related to an eye behavior of the user. See, e.g., Pet. 47–50; Ex. 1005, 7:1–3, 28–45, 46–47, 15:16–27, 26:51–67, 27:2–27, 28:55–59, 29:62–30:10, Figs. 4, 6, 7, 8A–8M; Ex. 1003 ¶¶ 98–104, 106. We also are persuaded that Black discloses a system “configured to sense and analyze another volitional behavior of the user using another sensor to help determine what to present to the user, the another volitional behavior of the user being of a type different from the volitional behaviors of the user.” See, e.g., Pet. 49–50; Ex. 1005, 27:7–17; Ex. 1003 ¶ 104. Patent Owner’s opposing argument is unpersuasive. See PO Resp. 57–58. The crux of Patent Owner’s argument is that claim 18 requires a determination of what to present to the user based upon the data from both of the sensors―i.e., the imaging sensor (related to a volitional eye behavior) and the “another sensor” related to “another volitional behavior of the user” (such as voice). Id. 
Patent Owner argues: Claim 18 requires, inter alia, a first “imaging sensor” to sense a “volitional behavior” of a user “related to an eye behavior of the user,” and “another sensor” to analyze another volitional behavior of the user of another type. It then uses data from both sensors to “determine what to present to the user.” Id. at 57. Patent Owner acknowledges that Black’s system 4 discloses, separately, determining what to present to the user based on eye motions and determining what to present to the user based on voice commands, but argues that “these separate disclosures do not teach that output from system 4 related to an eye of the user (the putative ‘imaging sensor’) is combined with data from a voice recognition system (the putative ‘another sensor’) to determine what to present to the user.” Id. at 58. We do not agree with Patent Owner that claim 18 requires a determination of what to present to the user based upon the data from both of the sensors. Nothing in the separate recitations “analyze the data [from the imaging sensor regarding volitional behaviors of the user] to help determine what to present to the user” and “analyze another volitional behavior of the user using another sensor to help determine what to present to the user” restricts the basis for making a determination to the combined data from the two sensors. Further, we are not informed of anything in the Specification or prosecution history that might support Patent Owner’s asserted claim construction. For the reasons given, we conclude that Petitioner has shown by a preponderance of the evidence that Black anticipates claim 18. c. Claim 22 Claim 22 recites: 22. 
A computer system for a user comprising: an imaging sensor configured to produce data regarding volitional behaviors of the user at different times; computer memory having materials for the user; and a controller associated with the computer memory, the controller configured to communicate with the imaging sensor and analyze the data to have materials retrieved for the user, based upon changes in the volitional behaviors, wherein the imaging sensor is configured to be detached from an area of the user to sense at least the area for producing the data, and wherein the controller is further configured to use a rule regarding a speed of the user to analyze the volitional behaviors of the user to help determine what to present to the user. Ex. 1001, 17:18–33. We agree that, as identified in Petitioner’s detailed claim chart, Black discloses each limitation of claim 22. Pet. 50–52; see Ex. 1003 ¶¶ 107–112; Pet. Reply 14–15. We are persuaded, for example, that Black discloses a controller that is “configured to use a rule regarding a speed of the user to analyze the volitional behaviors of the user to help determine what to present to the user” (emphasis added). See, e.g., id. at 23, 51–52; Ex. 1005, 7:39–43, Table 4; see Ex. 1003 ¶ 112. The claim term “a speed of the user” means “a rate of performance or action of the user.” See supra Section II.A.2. Corresponding to the requirement for a “rule regarding a speed of the user,” Black discloses the gesture recognition rule that a “relatively long pause before reversal of action” is indicative of the gesture “Shift Attention.” Ex. 1005, Table 4. We do not agree with Patent Owner’s argument that “analyzing a user’s movement over time is not a rule regarding ‘a speed of the user.’” See PO Resp. 59. Here, the referenced gesture recognition rule is a rule regarding a rate of action of the user as required by our claim construction. 
For the reasons given, we conclude that Petitioner has shown by a preponderance of the evidence that Black anticipates claim 22. E. Asserted Obviousness of Claims 37 and 39 over Black and Garwin Petitioner challenges claims 37 and 39 as obvious over Black and Garwin. Pet. 52–56 (claim chart); Ex. 1003 ¶¶ 113–125, 127. As discussed below, we are persuaded that the combination of Black and Garwin renders obvious claims 37 and 39. 1. Overview of Garwin Garwin discloses an embodiment that “employs light reflected from the eyeball of a user to establish direction of gaze and employs parts of a partitioned display area as the individual discrete information input locations.” Ex. 1006, 3:19–23. Figure 1 of Garwin is reproduced below: Figure 1 illustrates user 1 viewing display 2, which is partitioned into 12 boxes. Id. at 2:5–6, 5:4–9. As shown in Figure 1, the user’s direction of gaze communicates to processor 3 the selection, and order of selection, of items from the 12 boxes. Id. at 5:9–13. A calibration program correlates measurements of a user’s gaze to aiming marks on a display surface, and calibration constants obtained from the measurements are saved in memory with the user’s identity for later use in equations for computing the user’s direction of gaze. Id. at 17:15–18:34. In general, the calibration constants are stable with time for a given user, and, in most cases, it is not necessary to do a complete recalibration. Id. at 18:15–18. Instead, the “stored calibration constants” are adjusted utilizing a simplified procedure. Id. at 18:18–34. 2. Analysis Petitioner contends that a POSITA would have combined the teachings of Black and Garwin “because they are directed to the same field of human-computer interfaces and describe similar systems.” Pet. 28 (citing Ex. 1003 ¶¶ 114–115). In support of this contention, Dr. 
Forsyth testifies that “both Black and Garwin describe methods to sense human actions and adjust displays accordingly,” and “[a] POSITA would have recognized that techniques from Black and Garwin could be used in these same scenarios and could be combined into functioning systems.” Ex. 1003 ¶ 114. Petitioner additionally contends that “a POSITA would have been prompted to modify Black’s system to incorporate calibration as disclosed in Garwin in order to enhance the accuracy and use of the system.” Pet. 28 (citing Ex. 1003 ¶¶ 28, 114–115). We determine that incorporating Garwin’s eye-gaze teachings in Black’s system would have been a predictable improvement. See Ex. 1003 ¶ 114. Black discloses tracking head and eye motions of a user to determine where the user is looking in order to make selections from windows on a computer screen, but does not disclose a calibration methodology. Ex. 1005, 27:18–27. Garwin teaches tracking a different volitional action of the user―the user’s direction of eye gaze―to determine where the user is looking in order to make selections from windows on a computer screen. Ex. 1006, 2:5–13, Fig. 1. In addition, Garwin teaches a calibration methodology for accurately measuring the direction of eye gaze of individual users. Ex. 1006, 17:15–18:34. We are persuaded that incorporating Garwin’s eye-gaze teachings in Black’s system would have amounted to nothing more than using a known technique to improve a similar system in the same way. See Pet. 28–29; KSR, 550 U.S. at 417 (“[I]f a technique has been used to improve one device, and a person of ordinary skill in the art would recognize that it would improve similar devices in the same way, using the technique is obvious unless its actual application is beyond his or her skill.”). Patent Owner’s arguments challenging Petitioner’s rationale for combining the references are unpersuasive. See PO Resp. 52–53. 
For example, Patent Owner argues that “the Petition fails to explain how ‘calibration as disclosed in Garwin’ can or should be implemented in Black so as to disclose the limitations of Claim 37.” Id. Patent Owner’s argument appears to focus on differences in the individual references rather than the teachings of the references combined. In particular, Patent Owner does not explain why the calibration methodology taught by Garwin could not be incorporated, without significant change, into the Black/Garwin combination. We agree with Petitioner, moreover, that the combination of Black and Garwin teaches the subject matter of claims 37 and 39. Pet. 24–27, 52–56; Ex. 1003 ¶¶ 113–125, 127; see Pet. Reply 9–12. For example, we are persuaded that the asserted combination teaches a processor that compares data from different sessions. See supra Section II.B.2.b (discussing the different sessions requirement of claim 37 with respect to Hutchinson). In the combination of Black and Garwin, a user’s calibration data would be saved during an initial session and compared with data in a subsequent session of the user in order to determine the user’s direction of eye gaze. See Ex. 1006, 16:25–17:4, 17:15–18:34; Ex. 1003 ¶¶ 120–121, 123. In opposition, Patent Owner advances several arguments. PO Resp. 47. Patent Owner first argues that “Garwin does not teach the claimed multi-‘session’ approach because Garwin’s calibration is not a ‘session.’” Id. We are not persuaded by Patent Owner’s argument because claim 37 does not require a separate session devoted entirely to calibration. Patent Owner also argues that Garwin’s calibration constants are not “data regarding a volitional behavior of the user,” as recited in claim 37. Id. 
Specifically, Patent Owner argues that Garwin’s calibration constants C2, C3, D2 and D3 are obtained independent from any particular user and that Garwin’s remaining calibration constants, i.e., C1, C4–C8, D1, and D4–D8, while based on a particular user, depend on the internal details of the user’s eye and, for that reason, are not data regarding a volitional behavior of the user. PO Resp. 48–50. We disagree. As Petitioner argues in reply, Garwin’s calibration constants C1, C4–C8, D1, and D4–D8 are obtained by measuring the user’s volitional gaze to aiming marks on a display surface. See Pet. Reply 9–10 (citing Ex. 1006, 17:16–19, 17:34–18:2 and Ex. 1003 ¶ 120). As such, the calibration constants are data regarding a volitional behavior of the user. Whether the constants depend on the internal details of the user’s eye, as Patent Owner contends, is not relevant to the scope of the claim requirement. Patent Owner next argues that “Garwin does not teach ‘compar[ing]’ data from its calibration routine and its normal operation session as claimed.” PO Resp. 47. Specifically, Patent Owner argues that the constants C1–C8, D1–D8 simply are “multipliers” in equations used in Garwin for locating the user’s gaze on the screen, and “[t]he Petition does not show that any comparison is made during the normal session of Garwin between data obtained in the normal session and data obtained via calibration.” Id. at 51 (citing Ex. 1006, 16:14–17). We are not persuaded by Patent Owner’s argument, at least because it does not account for Garwin’s simplified recalibration methodology. See Ex. 1006, 18:15–34. During recalibration at the beginning of a normal operation session, the user directs his or her gaze to a single target at a known location, and a subroutine uses the stored calibration constants from a previous session, including C1 and D1, to calculate the apparent location of the target as seen by the user. Id. at 18–25. 
The subroutine then “modifies the constants C1 and D1 so that the computed location agrees with the known location.” Id. at 26–27 (emphasis added). The disclosure of a recalibration subroutine that modifies the stored constants C1 and D1 based on data from the current session teaches a processor that compares data from different sessions, as the claim requires. For the reasons given, we conclude that Petitioner has shown by a preponderance of the evidence that claims 37 and 39 would have been obvious over Black and Garwin. III. CONCLUSION For the foregoing reasons, we determine that Petitioner has shown by a preponderance of the evidence that claims 1, 2, 18, 22, 37, and 39 are unpatentable, but has not shown by a preponderance of the evidence that claims 38, 40–42, 48, 55, and 56 are unpatentable. IV. ORDER In view of the foregoing, it is hereby: ORDERED that claims 1, 2, 18, 22, 37, and 39 of U.S. Patent No. 8,475,174 B2 are unpatentable; and FURTHER ORDERED that claims 38, 40–42, 48, 55, and 56 of U.S. Patent No. 8,475,174 B2 are patentable. This is a Final Written Decision. Parties to the proceeding seeking judicial review of the decision must comply with the notice and service requirements of 37 C.F.R. § 90.2. PETITIONER: John C. Phillips David B. Conrad Jonathan Lamberson FISH & RICHARDSON P.C. phillips@fr.com conrad@fr.com lamberson@fr.com IPR35797-0015IP1@fr.com PATENT OWNER: Kenneth J. Weatherwax Parham Hendifar GOLDBERG, LOWENSTEIN & WEATHERWAX LLP weatherwax@glwllp.com hendifar@glwllp.com