VMware, Inc., Appeal No. 2021-001077 (P.T.A.B. Mar. 14, 2022)

UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450, www.uspto.gov

APPLICATION NO.: 14/668,776    FILING DATE: 03/25/2015    FIRST NAMED INVENTOR: Salim AbiEzzi    ATTORNEY DOCKET NO.: C020.02    CONFIRMATION NO.: 1052

151795 7590 03/14/2022
FISH & RICHARDSON P.C. (VMware)
P.O. BOX 1022
MINNEAPOLIS, MN 55440-1022

EXAMINER: TAN, ALVIN H    ART UNIT: 2178

NOTIFICATION DATE: 03/14/2022    DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es): PATDOCTC@fr.com, ipadmin@vmware.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte SALIM ABIEZZI

Appeal 2021-001077
Application 14/668,776
Technology Center 2100

Before MICHAEL J. STRAUSS, CHRISTA P. ZADO, and DAVID J. CUTITTA II, Administrative Patent Judges.

ZADO, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF THE CASE

Pursuant to 35 U.S.C. § 134(a), Appellant[1] appeals from the Examiner's decision to reject claims 1-20, all the claims pending in the present application. We have jurisdiction under 35 U.S.C. § 6(b). We AFFIRM.

[1] "Appellant" refers to "applicant" as defined in 37 C.F.R. § 1.42(a). Appellant identifies the real party in interest as VMware, Inc. Appeal Br. 1.
CLAIMED SUBJECT MATTER

The instant application relates to remote display protocols that allow a user device (e.g., client 150) to interact with a remote application running on a remote server (e.g., remote application system 100). Spec. ¶¶ 1-2. More specifically, the remote protocol facilitates: 1) transferring display data, for presentation on the user device, from the remote server to the user device, and 2) transferring user input, at the display device, from the user device to the remote server. Id. The user input can be used by the remote server to generate display data updates, wherein the updates are transferred to the user device to update its display. Id.

The Specification further describes a protocol that allows a user device to update its display in response to a behavior without needing to receive updated display data from the remote server. Id. ¶¶ 23-25. In an exemplary embodiment involving a slide presentation application running on a remote server (i.e., application 122), the application provides user interface data to the user device, wherein the user interface data includes, e.g., image data for photos to be displayed during the slide show. Spec. ¶ 29. Application 122 also provides reactive behavior data that includes information defining a user interface function for transitioning between photos upon detecting a trigger condition, as well as the trigger condition itself (e.g., a time- or event-based trigger). Id. ¶¶ 27, 29. The Specification explains that when the trigger condition is satisfied (e.g., when a specified number of seconds have elapsed, in the case of a time-based trigger), the user device samples the user interface function to generate the user interface data necessary to transition to the next photo in the slide show, and the next photo is displayed on the user device. Id. ¶ 29.
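The trigger-and-sample mechanism the Specification describes can be sketched in code. The sketch below is purely illustrative; the names (`Behavior`, `run_client`) and the structure of the behavior data are assumptions for exposition, not drawn from the application or the record.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Behavior:
    """Reactive behavior data: a trigger condition paired with a
    continuous user interface function that can be sampled at any
    time step (hypothetical structure, for illustration only)."""
    trigger: Callable[[float], bool]       # e.g., "N seconds elapsed"
    ui_function: Callable[[float], Dict]   # time step -> UI data

def run_client(behaviors: List[Behavior], elapsed: float,
               period: float, step: float) -> List[Dict]:
    """If a behavior's trigger condition is satisfied, sample its
    continuous UI function once per time step over the time period,
    producing display updates locally, with no round trip to the
    remote application system."""
    updates: List[Dict] = []
    for b in behaviors:
        if b.trigger(elapsed):
            t = 0.0
            while t < period:
                updates.append(b.ui_function(t))
                t += step
    return updates
```

For a time-based trigger that fires after five seconds, `run_client([...], elapsed=6.0, period=1.0, step=0.25)` would yield four locally generated updates; a call with `elapsed=2.0` would yield none, since the trigger condition has not yet occurred.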
Claim 1, reproduced below (with language at issue italicized), illustrates the claimed subject matter:

1. A method comprising:
    initiating, by a user device, a remote session with a remote application system that allows user interfaces generated by an application executing on the remote application system to be presented on the user device and user events associated with the presented user interfaces to be provided as input to the application;
    during the remote session with the remote application system:
        receiving, by the user device, reactive behavior data, wherein the reactive behavior data defines one or more behaviors to be performed by the user device and a respective trigger condition for each of the behaviors such that in response to a future occurrence of the respective trigger condition the corresponding one or more behaviors are performed by the user device, and wherein each behavior is associated with a respective continuous user interface function;
        determining, by the user device, that a particular trigger condition of the one or more trigger conditions has been satisfied; and
        in response to determining that the particular trigger condition has been satisfied:
            providing information about the trigger condition to the remote application system, and
            performing the one or more corresponding behaviors associated with the particular trigger condition, wherein performing the one or more corresponding behaviors includes:
                generating, by the user device, user interface updates, and wherein generating user interface updates includes sampling from the continuous user interface function, for the behavior associated with the particular trigger condition, that specifies user interface data to generate at each time step over a particular time period to determine user interface data for a particular time step, and
                updating a user interface generated by the application being displayed
by the user device using the user interface updates without receiving updated user interface data from the remote application system.

Appeal Br. 10-11 (Claims App.).

REFERENCES

The prior art relied upon by the Examiner is:

Name       Reference         Date
Beveridge  US 9,158,434 B2   Oct. 13, 2015
Abdo       US 9,167,020 B2   Oct. 20, 2015
Helter     US 9,565,227 B1   Feb. 7, 2017

REJECTIONS

The claims stand rejected as follows:

Claims Rejected                         35 U.S.C. §   Reference(s)
1, 3, 4, 8, 9, 11, 12, 16, 17, 19, 20   102           Beveridge
5-7, 13-15                              103           Beveridge, Helter
2, 10, 18                               103           Beveridge, Abdo

OPINION

Appellant appeals the Examiner's rejections of claims 1-20. Appellant provides arguments for claim 1, and submits that the remaining claims are allowable for the same reasons as claim 1. Appeal Br. 4-8. For reasons discussed below, we sustain the Examiner's rejections.

The Examiner rejects claim 1 under § 102 as anticipated by Beveridge. Final Act. 2-5. Pertinent to this appeal, claim 1 recites:

receiving, by the user device, reactive behavior data, wherein the reactive behavior data defines one or more behaviors to be performed by the user device and a respective trigger condition for each of the behaviors such that in response to a future occurrence of the respective trigger condition the corresponding one or more behaviors are performed by the user device, and wherein each behavior is associated with a respective continuous user interface function ("continuous user interface" limitation); and

generating, by the user device, user interface updates, and wherein generating user interface updates includes sampling from the continuous user interface function, for the behavior associated with the particular trigger condition, that specifies user interface data to generate at each time step over a particular time period to determine user interface data for a particular time step ("sampling" limitation).

Appeal Br. 10-11 (Claims App.).
The Examiner finds that Beveridge discloses these limitations. Final Act. 3-5. Like the instant application, Beveridge relates to virtualization that allows a user/client device (e.g., a tablet) to display data (e.g., photos or animations) from an application running on a remote host machine. Beveridge [57]. Virtualization also allows the user of the user device to interact with the display (e.g., select menu options). Id. Beveridge explains that virtualization challenges arise when a client device and host machine each have a different native graphical user interface ("GUI"). Id. at 1:45-56. For example, a client device may have a touch screen GUI, whereas a remote host machine may use a point-and-click driven interface such as a mouse. Id. Beveridge describes another virtualization challenge, namely that because a client device communicates with a host machine over a network, the virtualized environment may be subject to network latency, which may result in a jittery user experience. Id.

To address these challenges, Beveridge discloses a virtualization environment as follows. Client device 108 communicates over network 120 with virtual machine ("VM") 157. Beveridge Fig. 2. Client device 108 includes virtual desktop infrastructure ("VDI") client 110 and user interface virtualization ("UIV") client 202. Id. VM 157 includes VDI host agent 200 and UIV agent 204. Id. In pertinent part, VM 157 transmits user interface ("UI") metadata to client device 110, wherein the UI metadata describes UI elements such as windows, buttons, menus, dialog boxes, lists, scroll bars, title bars, status bars, size grips, toolbars, list view controls, dropdown lists, and input carets. Id. at 6:32-43. UIV client 202 uses the UI metadata to construct a native client UI widget having the same functionality as the virtual machine UI widget, but that is better suited to the client device (e.g., is more friendly to a touch screen environment).
Id. at 7:57-67. To address network latency, Beveridge discloses a technique in which VDI client 110 can modify native GUI elements using UI metadata without having to make repeated requests to VM 157. Beveridge 13:2-13. For example, in a word processing environment, selecting the menu item "Edit" causes a dropdown menu to appear, wherein the dropdown menu includes additional options, such as "Paste Special," etc. Id. at 12:52-60. Beveridge explains that in earlier systems, upon a user at the client selecting "Edit," the client did not already have UI information for the dropdown menu, but rather would have to make a request to the virtual machine for UI information to display the dropdown menu. Id. at 12:60-13:2. The network latency involved in transmitting a request to the virtual machine and waiting for a response resulted in a "laggy" user experience. Id. at 13:2-6. In contrast, Beveridge's VDI client 110 receives sufficient UI metadata at the outset to construct a native GUI widget with all of the functionality of the virtual machine UI widget (i.e., the native GUI widget includes the dropdown menu), thereby eliminating the need to request UI information for the dropdown menu. Id. at 13:2-13.

The Examiner finds that Beveridge's client device (i.e., the claimed "user device") receives the claimed "reactive behavior data," because Beveridge's client device receives, e.g., UI metadata that includes data on how to construct a native GUI, as well as data for display, such as photos and animations. Final Act. 3-4 (citing Beveridge 6:32-7:33, 11:56-57, 12:8-15).

Appellant argues that the rejection fails to show that the behaviors in Beveridge are associated with a respective continuous user interface function. Appeal Br. 7-8. We note that the term "continuous interface function" is used in the Specification as follows:

As used in this specification, a behavior is a user interface function that is continuous over time.
That is, the user interface function specifies user interface data that should be generated at each time step of a particular time period. Generally, the user interface data is time varying, i.e., the user interface data generated by sampling from the function will be different at each time step of the particular time period. In some cases, the user interface data is predetermined, e.g., an animated image that updates at specified intervals. In some other cases, the user interface data may depend on inputs to the user interface function. For example, a behavior may specify a transition over time between one image and another image in a slideshow, with the user interface data being generated as part of the behavior being dependent on the pixel data of the two images.

Spec. ¶ 24 (emphasis added).

In the Answer, the Examiner clarifies that, contrary to Appellant's argument, Beveridge's animation sequence discloses claim 1's requirement that "each behavior is associated with a respective continuous user interface function." Ans. 4-5. Beveridge discloses, in pertinent part:

In one embodiment, UIV client 202 may generate and display a native animation selection GUI element based on UIV profile 144 associated with an application having an animation selection GUI. In some cases, animation selection may involve cycling through available animation options and waiting until a sample animation clip plays for each option. In a VDI environment, display protocol latency and bandwidth can make this task a jittery and frustrating experience. In one embodiment, UIV client 202 may activate a context-sensitive UIV button on client GUI 260 that, when selected, generates and displays a native animation selection GUI element having a callout that includes a pre-defined number of animations presented locally on VDI client 110, such as "Fades", "Dissolves," etc.
By creating a compact local animation, the native GUI element enables a user to preview and make an animation selection without taxing the remote session's CPU and generating excessive display protocol traffic. Furthermore, embodiments of the present invention prevent poorly-performing and higher latency networks from having a substantial impact on user experience during animation selection.

Beveridge 15:18-37 (emphasis added).

The Examiner notes that the Specification describes a "continuous user interface function" as a function that specifies user interface data that should be generated at each time step of a particular time period. Ans. 4 (citing Spec. ¶ 24). The Examiner finds, therefore, that cycling through animation selections, as disclosed in Beveridge, is a continuous user interface function. Id. at 4-5.

The record supports the Examiner's interpretation of an animation sequence as being a continuous user interface function. As shown above, the Specification provides an exemplary continuous user interface function as one in which "a behavior may specify a transition over time between one image and another image in a slideshow, with the user interface data being generated as part of the behavior being dependent on the pixel data of the two images." Spec. ¶ 24. Similarly, Beveridge discloses cycling through available animation options and waiting for an animation clip to play for each option. Beveridge 15:18-37. In other words, in both the Specification's example and Beveridge's disclosure, a photo or animation clip, respectively, is displayed or played during each time step.
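The parallel drawn here (display data produced per time step by a time-varying function) can be made concrete with a small sketch. This is an illustration of the general idea only; the `crossfade` and `sample` helpers are hypothetical and appear nowhere in the Specification or in Beveridge.

```python
def crossfade(pixels_a, pixels_b):
    """A continuous user interface function for a transition between
    two images: maps a time t in [0, 1] to blended pixel data
    (hypothetical helper, for illustration only)."""
    def ui_function(t):
        return [(1.0 - t) * a + t * b for a, b in zip(pixels_a, pixels_b)]
    return ui_function

def sample(ui_function, period, steps):
    """Sample the continuous function once per time step; each sample
    is the user interface data the client displays for that step,
    generated locally rather than fetched from the remote server."""
    return [ui_function(i * period / steps) for i in range(steps + 1)]
```

Sampling a one-second crossfade at five steps yields six frames, the first identical to the outgoing image and the last to the incoming one; the Specification's slideshow transition and, on the Examiner's reading, Beveridge's animation cycling both fit this per-time-step pattern.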
As found by the Examiner, Beveridge describes UI metadata that includes semantic elements for controls (e.g., behavior data), wherein the controls include animation selections triggered by user input, and provide specified user interface data generated at each time step of a time period (e.g., behaviors associated with a continuous user interface function). Ans. 4-5. Appellant has not shown any reversible error in the Examiner's findings.

Appellant essentially argues, incorrectly, that the Examiner does not identify where Beveridge discloses certain claim recitations of the "continuous user interface" limitation. Specifically, Appellant asserts that in the Answer the Examiner "concludes, without evidence, that an animation UI element [in] Beveridge" discloses the limitation reciting that the "behavior is associated with a respective continuous user interface function." Reply Br. 3. According to Appellant, the Examiner cites to portions of Beveridge "up to the point of discussing the actual claim limitations, at which point the Examiner simply concludes" the claim language is disclosed. Id. Appellant submits that the Examiner has failed to identify portions of Beveridge that disclose "reactive behavior data" that "defines one or more [trigger] behaviors to be performed by the user device and a respective trigger condition for each of the behaviors" where each behavior "is associated with a respective continuous user interface function." Id.

Contrary to Appellant's arguments that the Examiner's findings fail to identify portions of Beveridge that disclose the claim elements at issue, the Examiner identifies where Beveridge is found to disclose the claim elements and provides an explanation as to why, as we discussed above. Ans.
3-5 (e.g., finding UI semantic metadata information discloses "reactive behavior data"; finding UI metadata generated controls allowing for animation sequences are controlled by user input (e.g., the reactive behavior data defines one or more trigger behaviors to be performed by the user device and a respective trigger condition for each of the behaviors); finding the controls for the animations specify user interface data generated at each step of a time period (e.g., the behaviors are associated with a respective user interface function)). Appellant does not acknowledge these specific findings by the Examiner, much less address why they are in error. For the foregoing reasons, Appellant's arguments are insufficient to demonstrate Examiner error as to the "continuous user interface" limitation.

Appellant also argues that the Examiner has not shown Beveridge discloses the "sampling" limitation. Appeal Br. 5-7. According to Appellant, Beveridge does not disclose sampling from a user interface function (id. at 5-6) "that specifies user interface data to generate at each time step over a particular time period to determine user interface data for a particular time step," as recited in claim 1. Appellant acknowledges that the rejection cites to Beveridge's disclosure of cycling through animation options, but Appellant argues that the Examiner does not reference "sampling" in the rejection. Id. at 7. In the Answer, the Examiner clarifies that cycling through animation options over time, as disclosed in Beveridge, involves sampling from a continuous user interface function to generate user interface updates over a particular time period. Ans. 4 (citing Beveridge 15:18-23). This is a reasonable interpretation of Beveridge's disclosure.
As we discussed above, the Specification describes a behavior specifying transitioning between one image and another image in a slide show as an example of time-varying user interface data, which the Specification refers to as "user interface data generated by sampling from the [user interface] function" that "will be different at each time step of the particular time period." Spec. ¶ 24. Similarly, Beveridge discloses behavior specifying transitioning between one animation sequence and another animation sequence in order to cycle from one animation sequence option to another. Ans. 4; Beveridge 15:18-37. Therefore, Beveridge reasonably discloses sampling from a continuous interface function, for the behavior associated with a particular trigger condition, that specifies user interface data to generate at each time step over a particular time period to determine user interface data for a particular time step.

Appellant's arguments in the Reply Brief are not sufficient to rebut the Examiner's findings. In particular, Appellant misapprehends the Examiner's findings and, therefore, fails to address them. The portion of Beveridge cited by the Examiner, which describes cycling through animation options, refers to each animation as a sample animation: "In some cases, animation selection may involve cycling through available animation options and waiting until a sample animation clip plays for each option." Beveridge 15:21-23. Appellant argues that this use of the word sample in Beveridge serves as the basis of the Examiner's rejection, and that such use is different from that in the claim. Reply Br. 3. However, the Examiner's rejection does not equate the word sample in Beveridge's "sample animation clip" with the word "sample" in claim 1. Ans. 4.
Rather, as we discussed above, for disclosure of sampling from the continuous user interface function, the Examiner relies on the feature of cycling through animation options over time: "cycling through animation options over time would involve sampling from a continuous user interface function to generate user interface updates over a particular time period." Ans. 4 (citing Beveridge 15:18-23).

For the foregoing reasons, Appellant has not demonstrated Examiner error in rejecting claim 1, and we, therefore, affirm the rejection of claim 1. Furthermore, because the claims are argued together, we also affirm the rejections of claims 2-20.

CONCLUSION

The Examiner's rejections are affirmed.

DECISION SUMMARY

In summary:

Claims Rejected                         35 U.S.C. §   Reference(s)        Affirmed                                Reversed
1, 3, 4, 8, 9, 11, 12, 16, 17, 19, 20   102           Beveridge           1, 3, 4, 8, 9, 11, 12, 16, 17, 19, 20
5-7, 13-15                              103           Beveridge, Helter   5-7, 13-15
2, 10, 18                               103           Beveridge, Abdo     2, 10, 18
Overall Outcome                                                           1-20

TIME PERIOD FOR RESPONSE

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a). See 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED