Ex parte Brubaker, No. 12/610,219 (P.T.A.B. Aug. 28, 2017)

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450, www.uspto.gov

Application No.: 12/610,219
Filing Date: 10/30/2009
First Named Inventor: Jeffrey T. Brubaker
Attorney Docket No.: SNPS-1241
Confirmation No.: 9967
Correspondence Address: PVF - SYNOPSYS, INC., c/o PARK, VAUGHAN, FLEMING & DOWLER LLP, 2820 Fifth Street, Davis, CA 95618-7759
Examiner: TRAPANESE, WILLIAM C
Art Unit: 2171
Notification Date: 08/30/2017
Delivery Mode: Electronic

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte JEFFREY T. BRUBAKER

Appeal 2017-003581
Application 12/610,219¹
Technology Center 2100

Before CARLA M. KRIVAK, HUNG H. BUI, and JON M. JURGOVAN, Administrative Patent Judges.

BUI, Administrative Patent Judge.

DECISION ON APPEAL

Appellant seeks our review under 35 U.S.C. § 134(a) from the Examiner's Final Rejection of claims 24-26, which are all the claims pending in the application. We have jurisdiction under 35 U.S.C. § 6(b). We AFFIRM.²

¹ According to Appellant, the real party in interest is Synopsys, Inc. App. Br. 3.

² Our Decision refers to Appellant's Appeal Brief ("App. Br.") filed April 6, 2016; Reply Brief ("Reply Br.") filed January 3, 2017; Examiner's Answer ("Ans.") mailed November 2, 2016; Final Office Action ("Final Act.") mailed October 7, 2015; and original Specification ("Spec.") filed October 30, 2009.

STATEMENT OF THE CASE

Appellant's Invention

Appellant's invention relates to a system and method, shown in Figure 2, for generating widgets (graphical interface elements) for use in a window in a graphical user interface (GUI) based on predefined widget rules. Spec. ¶¶ 11, 52, 68; Title; Abstract. According to Appellant, a user is able to generate and modify a graphical interface element without knowing details of how the element is constructed, so that the user may focus on high-level aspects of what is to be presented in the GUI. Spec. ¶¶ 36, 59, 68; Abstract. Different types of widgets/GUI elements (e.g., radio buttons, tabbed and notebook layouts, check boxes, line edits, combo boxes, and group boxes) are dynamically generated based on a description of a non-graphical object and the object's attributes. Spec. ¶¶ 7, 8, 11, 36, 52, 67, 71, 79; Abstract. The non-graphical object may be a data object associated with an executable procedure in computer software. Spec. ¶¶ 7, 11, 52, 53, 60. Attributes associated with a non-graphical object may describe the object's data model (such as the model's types and restrictions), restrictions on a range of possible values, valid discrete values associated with the object, and read-only restrictions. Spec. ¶¶ 6, 36, 52, 72. Figure 2 shows a method 200 for presenting a window in a GUI, as reproduced below.

[Figure 2: flowchart of method 200.] Figure 2 depicts a method 200 for presenting a window in a GUI. Spec. ¶¶ 17, 52.
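For orientation only, the following is a minimal illustrative sketch, in Python, of the general pattern summarized above: widgets are generated from the attributes of a non-graphical data object, and a controller keeps the widgets and the object synchronized when an attribute is updated. The sketch is not drawn from Appellant's Specification or from the record, and every class, function, and attribute name in it is hypothetical. The step-by-step description of Appellant's Figure 2 follows.

```python
# Illustrative sketch only: all names here are hypothetical and are not taken
# from Appellant's Specification or from the Scherpa reference.

class NonGraphicalObject:
    """A plain data object with attributes; it knows nothing about how it is displayed."""

    def __init__(self, **attributes):
        self.attributes = dict(attributes)  # e.g. {"verbosity": 10, "trusted": False}
        self._observers = []                # callbacks registered by controllers

    def observe(self, callback):
        self._observers.append(callback)

    def set(self, name, value):
        self.attributes[name] = value
        for callback in self._observers:    # notify controllers of the update
            callback(name, value)


def generate_widget(name, value):
    """Choose a widget type from the attribute's value type (a stand-in for 'widget rules')."""
    if isinstance(value, bool):
        return {"type": "check_box", "label": name, "checked": value}
    if isinstance(value, (int, float)):
        return {"type": "slider", "label": name, "value": value}
    return {"type": "line_edit", "label": name, "text": str(value)}


class WidgetController:
    """Keeps generated widgets synchronized with their non-graphical object."""

    def __init__(self, obj, widgets):
        self.widgets = widgets
        obj.observe(self.on_attribute_update)

    def on_attribute_update(self, name, value):
        # Modify the visual attributes of the affected widget; a GUI would then re-present it.
        self.widgets[name] = generate_widget(name, value)


# Usage: generate widgets from the object's attributes and keep them in sync.
obj = NonGraphicalObject(verbosity=10, trusted=False, note="hello")
widgets = {name: generate_widget(name, value) for name, value in obj.attributes.items()}
controller = WidgetController(obj, widgets)

obj.set("verbosity", 50)      # the controller updates the corresponding widget
print(widgets["verbosity"])   # {'type': 'slider', 'label': 'verbosity', 'value': 50}
```

The observer registration shown here is one conventional way, among many, to provide the synchronization role that claim 24 assigns to a "controller object"; it is offered only as an aid to reading the claim language quoted below.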
As shown in Appellant's Figure 2, a computer system receives a description of a non-graphical object and associated attributes from a user (210), generates widgets/GUI elements based on the description (212), arranges the generated widgets in a window based on layout rules (216), and presents the window in a graphical user interface (222). Spec. ¶ 52. A widget controller (controller object) may be created in association with the widget (214), to keep the non-graphical object and the widget synchronized in accordance with dynamic updates to the widget or to the object's attributes. Spec. ¶¶ 53, 54, 63, 68.

Claims 24, 25, and 26 are independent. Representative claim 24 is reproduced below with disputed limitations in italics:

24. A computer-implemented method, comprising:
receiving a non-graphical object having multiple attributes;
dynamically generating one or more graphical interface elements based on one or more attributes of the non-graphical object;
dynamically generating a controller object to keep the non-graphical object and the one or more graphical interface elements synchronized;
arranging the one or more graphical interface elements in a graphical user interface (GUI) based at least on a type of each graphical interface element;
detecting, by the controller object, an update to an attribute of the non-graphical object;
modifying, by the controller object, a visual attribute of at least one graphical interface element based on the update to the attribute of the non-graphical object; and
presenting the at least one graphical interface element with the modified visual attribute in the GUI.

App. Br. 19-21 (Claims App'x).

Examiner's Rejection & Reference

Claims 24-26 stand rejected under 35 U.S.C. § 102(b) as being anticipated by Scherpa et al. (US 2010/0275141 A1; published Oct. 28, 2010; "Scherpa"). Final Act. 2-3.

Issue on Appeal

Based on Appellant's arguments, the dispositive issue on appeal is whether Scherpa teaches the following limitations:

(i) the disputed "non-graphical object" limitation ("receiving a non-graphical object having multiple attributes"), as recited in Appellant's independent claims 24-26,

(ii) the disputed "graphical interface element[]" limitation ("dynamically generating one or more graphical interface elements based on one or more attributes of the non-graphical object"), as recited in Appellant's independent claims 24-26, and

(iii) the disputed "controller object" limitation ("dynamically generating a controller object to keep the non-graphical object and the one or more graphical interface elements synchronized; . . . [and] modifying, by the controller object, a visual attribute of at least one graphical interface element based on the update to the attribute of the non-graphical object"), as recited in Appellant's independent claims 24-26.

App. Br. 10-17; Reply Br. 5-12.

ANALYSIS

With respect to independent claim 24, the Examiner finds Scherpa's user input teaches Appellant's claimed non-graphical object, and Scherpa's inputted user ratings teach attributes of the non-graphical object. Ans. 2-3 (citing Scherpa ¶¶ 3-5). The Examiner then finds Scherpa's displayed avatar teaches a graphical interface element that is dynamically generated based on the attributes (inputted user ratings) of the non-graphical object (user input) because the user ratings (attributes) dynamically alter the appearance of the avatar on the screen. Ans. 3 (citing Scherpa ¶ 3); Final Act. 2 (citing Scherpa ¶¶ 40-41).
The Examiner additionally finds Scherpa's avatar process is a controller object dynamically generated by a server or client application to keep the non-graphical object (user input) and the graphical interface element (displayed avatar) synchronized. Ans. 3 (citing Scherpa ¶ 28); Final Act. 2-3 (citing Scherpa ¶ 3, Fig. 5). Scherpa's Figure 5 is reproduced below with additional markings for illustration.

[Scherpa's Figure 5 (annotated): avatar 150 alongside a panel of user ratings (honesty, verbosity, temperament, health), a comment box, and a "Rate this Input" control.]

Scherpa's Figure 5 shows a user interface rendered by an application executing an avatar process, the avatar process displaying an avatar 150 with features (e.g., hair feature 152, eye feature 154) that are modified based on user input (provided via comment box 200, rating box 250, and rating selector 252) regarding honesty, verbosity, temperament, and health of a person represented by avatar 150. See Scherpa ¶¶ 3, 17, 48; Abstract.

Appellant disputes the Examiner's factual findings regarding Scherpa. In particular, Appellant contends the Examiner erred in finding Scherpa discloses the "non-graphical object" limitation (i.e., "receiving a non-graphical object having multiple attributes") recited in claim 24. App. Br. 10-13; Reply Br. 5-12. Appellant argues Scherpa does not disclose the claimed "'non-graphical object' [which] refers to a software entity (e.g., a non-graphical data structure) in a computer system"; rather, Scherpa merely discloses "user inputs . . . [that] correspond to attributes of a user/avatar, but the user and the avatar are both graphical objects." App. Br. 10, 12-13; Reply Br. 7-8 (citing Scherpa ¶¶ 31, 44, 45, Fig. 1). Appellant also argues Scherpa does not disclose the "graphical interface element" limitation because Scherpa does not disclose the claimed attributes and "does not dynamically generate a graphical element; instead, Scherpa merely modifies an already existing graphical element." Reply Br. 8-11 (citing Scherpa ¶¶ 40, 46); App. Br. 10, 17. Lastly, Appellant argues Scherpa does not disclose the "controller object" limitation because Scherpa does not teach a "controller object [that] comes into existence after (1) the attributes of the non-graphical object are received, and (2) graphical interface elements are dynamically generated based on the attributes of the non-graphical object." Reply Br. 11; App. Br. 10, 14-15.

We do not find Appellant's arguments persuasive. Rather, we find the Examiner has provided a comprehensive response to Appellant's arguments supported by a preponderance of evidence. Ans. 2-3. As such, we adopt the Examiner's findings and explanations provided therein. Id. For additional emphasis, we note claim terms are given their broadest reasonable interpretation consistent with the Specification. In re Am. Acad. of Sci. Tech Ctr., 367 F.3d 1359, 1364 (Fed. Cir. 2004). Under the broadest reasonable interpretation, claim terms are given their ordinary and customary meaning, as would be understood by one of ordinary skill in the art in the context of the entire disclosure.
In re Translogic Tech., Inc., 504 F.3d 1249, 1257 (Fed. Cir. 2007).

Appellant's Specification does not provide an explicit and exclusive definition of the claimed term "non-graphical object." Rather, Appellant's Specification provides discussion of non-limiting examples of "non-graphical objects," which include "a given object (such as a data object)" or "an abstract object (which has an 'internal' presentation of what the user wants)" having "model data" and "information . . . to generate widgets," where the non-graphical object may be "associated with an executable procedure in the computer software," and may be "manipulated directly by [a] batch process" or "based on user input." See Spec. ¶¶ 7, 53, 55, 60, 67, 70, 71, 107-109.

Based on Appellant's Specification, the Examiner has broadly interpreted the term "non-graphical object" as encompassing Scherpa's "inputs from users" because "the inputs [from users] are non-graphical objects (i.e. they are not displayed on the screen) that contain attributes (i.e. user ratings)." Ans. 2-3 (citing Scherpa ¶¶ 3-5). We find the Examiner's interpretation reasonable and consistent with Appellant's Specification. Scherpa's user input is information provided by on-screen buttons, or by "user input data . . . [recorded] in a separately-maintained file" and by "one or more data systems/databases" such as "a human resources database, a news database, or any other data systems/databases that may retain information relevant to attributes of an avatar." See Scherpa ¶ 45; see also Scherpa ¶¶ 3-5, 31; Ans. 2-3. Thus, Scherpa's user input includes a computer's expression of user-provided data, the user input instructing a processor to specify an avatar's features. See Scherpa ¶¶ 3, 31, 40-41. Scherpa's user input, including data instructing a processor to specify avatar features, is commensurate with the broad description of "non-graphical object" in Appellant's Specification.

Appellant's argument that Scherpa's user inputs "correspond to attributes of a user/avatar, but the user and the avatar are both graphical objects" (Reply Br. 7) does not present persuasive evidence or reasoning rebutting the Examiner's finding that Scherpa's "inputs [from users] are non-graphical objects (i.e. they are not displayed on the screen)." Ans. 2-3.

Appellant further argues Scherpa does not teach attributes of a non-graphical object as claimed, but only teaches attributes of graphical objects or human beings. Reply Br. 7-8; App. Br. 11-13. Particularly, Appellant argues "[t]here is nothing in Scherpa that uses the term 'attribute' in the context of an attribute of a user input"; rather, Scherpa's "attributes such as 'honesty, verbosity, temperament, and health' . . . are for an actual human being." Reply Br. 7. Appellant's arguments do not address the Examiner's specific findings that Scherpa's user ratings are the claimed attributes of a non-graphical object (user input) because Scherpa's "inputs [from user] contain ratings of a user and are a non-graphical object that contains attributes that change the appearance of the avatar." Ans. 3. We agree with the Examiner's findings. Scherpa's user ratings characterize the user input (non-graphical object). Ans. 3. Scherpa's user ratings, which include information to represent an avatar, are commensurate with the broad description of "attributes" in Appellant's Specification.³ Ans. 3; see Scherpa ¶¶ 3, 4, 48, 49, Title; Spec. ¶¶ 6, 56, 68, 107.
³ Appellant's Specification describes "attributes [that] include information other than that associated with a visual presentation of the object (i.e., the user may specify the data model (such as types, restrictions, etc) and very little about the user interface)," "attributes [that] may include runtime core objects in graphics module 832 that represent the model and the view objects, such as widgets," and attributes determining "resizing at least some of the widgets . . . based on changes to the associated attributes during execution of the computer software, thereby dynamically resizing the window." Spec. ¶¶ 6, 56, 107 (emphases added). The Specification provides that "user software code can interact with a disconnected object by changing its attributes, and the corresponding widget builder(s) may monitor these changes and updates to the widget." Spec. ¶ 68 (emphasis added).

Accordingly, we agree with the Examiner that Scherpa teaches the "non-graphical object" limitation (i.e., "receiving a non-graphical object having multiple attributes") recited in claim 24.

We are also not persuaded by Appellant's argument that Scherpa does not teach the "graphical interface element" limitation (i.e., "dynamically generating one or more graphical interface elements based on one or more attributes of the non-graphical object") recited in claim 24. As discussed supra, we agree with the Examiner that Scherpa teaches "attributes of the non-graphical object." We also agree with the Examiner's findings that Scherpa's "avatar is a graphical interface element as it is displayed on the screen," and "Scherpa teaches using [the user] rating to dynamically alter the appearance of [the] avatar," the avatar's "alteration [being] interpreted as a form of dynamic generation." Ans. 3 (citing Scherpa ¶ 3). Appellant's argument that "Scherpa does not dynamically generate a graphical element" but "merely modifies an already existing graphical element" (see Reply Br. 8-9 (emphasis added)) is also not commensurate with the scope of claim 24. The claimed "dynamically generating" does not exclude modifying an already existing graphical element (e.g., an avatar such as that shown in Scherpa's Figure 3) to produce a changed graphical element (e.g., a changed avatar such as that shown in Scherpa's Figure 8). Thus, Scherpa's modified avatar teaches a dynamically generated graphical interface element, as claimed. Ans. 3.

As to Appellant's argument that the "'avatar process' in Scherpa cannot be equated with the controller object" because the "controller object comes into existence after" the receiving and graphical element generating steps (see Reply Br. 10-11), we note the broadly recited limitations in claim 24 do not exclude pre-existing code for the controller object. Ans. 3. Appellant's Specification also does not limit the term "dynamically generating" to "creat[ing] on-the-fly during runtime" as Appellant asserts. Reply Br. 5; see Spec. ¶¶ 8, 36, 54, 63, 66-68, 71. As recognized by the Examiner, Scherpa's "avatar process . . . is merely code until executed by a client application," and "[u]pon execution, the avatar, graphical objects, system for modifying the avatar, etc. that are controlled by the process are created and don't exist prior to execution." Ans. 3 (citing Scherpa ¶ 28). Thus, Scherpa dynamically generates a controller object by generating the avatar process software and executing the software by a client or server application.
See Scherpa ¶¶ 28, 29; Ans. 3. Scherpa's executing the avatar process software keeps the non-graphical object (user input) and the graphical interface element (displayed avatar) synchronized, and modifies a visual attribute of the graphical interface element, as required by claim 24. Ans. 3. We further note that "no disclosure of similar advantages in Scherpa [regarding a dynamically generated controller object]" (see Reply Br. 11-12 (emphasis added)) is irrelevant to the anticipation analysis.

For the reasons set forth above, Appellant has not persuaded us of error in the Examiner's rejection of independent claim 24. Accordingly, we sustain the Examiner's anticipation rejection of independent claim 24. For the same reasons as claim 24, we sustain the anticipation rejection of independent claims 25 and 26, which recite the aforementioned contested "non-graphical object," "graphical interface element," and "controller object" limitations using commensurate language and are argued therewith.

CONCLUSION

On the record before us, we conclude Appellant has not demonstrated the Examiner erred in rejecting claims 24-26 under 35 U.S.C. § 102(b).

DECISION

As such, we AFFIRM the Examiner's Final Rejection of claims 24-26.

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED