UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450, www.uspto.gov

APPLICATION NO.: 13/920,718
FILING DATE: 06/18/2013
FIRST NAMED INVENTOR: Daniel L. Cosentino
ATTORNEY DOCKET NO.: C00006680USU2/LG10574.082
CONFIRMATION NO.: 8154

118252 7590 10/28/2019
Medtronic, Inc.
710 Medtronic Parkway MS LC340
Minneapolis, MN 55432

EXAMINER: WILLIAMS, TERESA S
ART UNIT: 3686
NOTIFICATION DATE: 10/28/2019
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es): rs.patents.three@medtronic.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
__________
BEFORE THE PATENT TRIAL AND APPEAL BOARD
__________
Ex parte DANIEL L. COSENTINO, CHRISTOPHER T. ABRAHAMSON, and KRISTIN N. PARROTT
__________
Appeal 2018-007708
Application 13/920,718 [1]
Technology Center 3600
__________
Before DONALD E. ADAMS, TAWEN CHANG, and RACHEL H. TOWNSEND, Administrative Patent Judges.

TOWNSEND, Administrative Patent Judge.

DECISION ON APPEAL

This is an appeal under 35 U.S.C. § 134 involving claims to a system for improving the accuracy of patient data, which have been rejected as being directed to patent-ineligible subject matter, as failing to comply with the written description requirement of 35 U.S.C. § 112, and/or as being obvious. We have jurisdiction under 35 U.S.C. § 6(b). We affirm.

[1] We use the word "Appellant" to refer to "Applicant" as defined in 37 C.F.R. § 1.42. Appellant Cardiocom, LLC identifies the real party in interest as Medtronic, Inc. (Appeal Br. 3; see also Filing Receipt dated June 8, 2013.)

STATEMENT OF THE CASE

Appellant's invention is directed to a system for receiving patient data that is "verified" by a health care professional after having automatically been flagged and filtered by the system, and which thereafter transmits the verified data to a client. (See Spec. ¶¶ 2, 18.)

Claims 11–15 and 17–30 are on appeal. Claim 11 is representative and reads as follows:
11. A system for improving the accuracy of patient data transmitted to a client, the system comprising:
    a remote patient monitoring apparatus;
    a central processing unit, the central processing unit including a flagging and filtration system and a database storing a predefined rule set, the flagging and filtration system in communication with the remote patient monitoring apparatus via a network; and
    a health care communication device, the health care communication device in communication with at least the flagging and filtration system,
    wherein the remote patient monitoring apparatus is configured to:
        collect patient data, the patient data comprising at least one patient measurement; and
        transmit the patient data to the flagging and filtration system,
    wherein the flagging and filtration system is configured to:
        receive the patient data from the remote patient monitoring apparatus;
        store the patient data in the database;
        categorize and flag the patient data based on the predefined rule set to create categorization and flagging information that specifies a level of accuracy for the patient data;
        present, on the health care communication device for verification by a health care professional, the patient data with the categorization and flagging information;
        update the patient data in the database to reflect changes made to the levels of accuracy specified by the categorization and flagging information during the verification by the health care professional;
        filter the patient data based on the levels of accuracy specified in the categorization and flagging information, including the changes made to the levels of accuracy, to create a portion of the updated patient data; and
        transmit the portion of the updated patient data to a client.

(Appeal Br. 16.)

The following grounds of rejection by the Examiner are before us on review:

Claims 11–15 and 17–30 under 35 U.S.C. § 101 as being directed to patent-ineligible subject matter.

Claims 25 and 29 under 35 U.S.C. § 112, first paragraph as lacking written description.

Claims 11, 15, 23, 24, 26–28, and 30 under 35 U.S.C. § 103(a) as unpatentable over Knapp [2] and Pratt [3].

Claims 12–14 under 35 U.S.C. § 103(a) as unpatentable over Knapp, Pratt, and Banet [4].

Claims 17–20 under 35 U.S.C. § 103(a) as unpatentable over Knapp, Pratt, and Papadopoulos [5].

Claims 21 and 22 under 35 U.S.C. § 103(a) as unpatentable over Knapp, Pratt, and Shusterman [6].

Claims 25 and 29 under 35 U.S.C. § 103(a) as unpatentable over Knapp, Pratt, and Muradia [7].

[2] Knapp, US 6,278,999 B1, issued Aug. 21, 2001.
[3] Pratt et al., US 2005/0192843 A1, published Sept. 1, 2005.
[4] Banet et al., US 2008/0097178 A1, published Apr. 24, 2008.
[5] Papadopoulos et al., US 8,618,930 B2, issued Dec. 31, 2013.
[6] Shusterman, US 7,801,591 B1, issued Sept. 21, 2010.
[7] Muradia, US 2008/0077436 A1, published Mar. 27, 2008.

DISCUSSION

I. Patent Ineligible Subject Matter

The Examiner finds that "claims 11–15 and 17–30 are directed to the abstract idea of filtering and flagging collected patient data based on a predefined rule set [and] present[ing] the patient data with the flagging information for verification by a health care professional." (Final Action 2.) The Examiner also finds that the claims "do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional computer elements, which are recited at a high level of generality, provide conventional computer functions that do not add meaningful limits to practicing the abstract idea." (Id. at 2–3.)
The Examiner explains that "the basic computer functions of filtering, flagging collected patient data, further to transmit, receive, store, categorize, create, present, update, and reflect changes are not effecting upon the components as 'an improvement to computer systems', but is rather performing basic data manipulating and filtering operations that any generic computer would be expected to do." (Ans. 4.) Moreover, the Examiner notes that "[w]ith regards to automated functions improving an automated computer system, some of the steps of the claimed invention, such as verifying patient data and updating the patient data specified by the categorization and flagging information during the verification, are actually carried out by a health care professional. Therefore, transformational steps of the claimed invention are carried out manually, not automatically." (Id. at 4–5.)

Appellant argues that "the claimed solution offers a specific, technical solution to problems with existing remote patient monitoring systems." (Appeal Br. 9.) According to Appellant, "[w]hile it is true that a general purpose computer could be programmed in such a manner, once it has been so programmed it ceases to be a general purpose computer and becomes a specially-programmed computer." (Id.) Appellant further notes that "the claimed solution is a hybrid solution in which categorization information is first automatically generated and then provided for review and editing by a health care professional (e.g., a trained nurse). The edited categorization information is then used to filter the patient data before it is transmitted to a client." (Id. at 10–11.) Appellant explains that the claimed invention "allow[s] healthcare providers to improve filtering by providing customized changes to the categorization and filtering information." (Id. at 12.)

ANALYSIS

35 U.S.C. § 101 defines patent-eligible subject matter. The Supreme Court has carved out exceptions to what would otherwise appear to be within the literal scope of § 101. Alice Corp. v. CLS Bank Int'l, 573 U.S. 208, 216 (2014). One of these exceptions is claims "directed to" an abstract idea. Id. at 217. This appeal involves the abstract idea exception to patent eligibility under section 101.

The Supreme Court has established a two-step framework for "distinguishing patents that claim laws of nature, natural phenomena, and abstract ideas from those that claim patent-eligible applications of those concepts." Id. "First, we determine whether the claims at issue are directed to" a patent-ineligible concept. Id. If so, "we consider the elements of each claim both individually and 'as an ordered combination' to determine whether the additional elements 'transform the nature of the claim' into a patent-eligible application." Id. (quoting Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 78–79 (2012)).

The United States Patent and Trademark Office (PTO) issued the 2019 Revised Patent Subject Matter Eligibility Guidance ("Guidance"), indicating how the PTO would analyze patent eligibility under the Supreme Court's two-step framework. 84 Fed. Reg. 50–57 (January 7, 2019). Applying the Guidance, we agree with the Examiner that the pending claims are directed to patent-ineligible subject matter.
STEP 2A, Prong One:

Under the Guidance, in determining what concept a claim is "directed to" in step one of the Supreme Court's two-step framework, we first look to whether the claim recites judicial exceptions, such as a mathematical concept (including mathematical relationships, mathematical formulas or equations, mathematical calculations), methods of organizing human activity (such as advertising, marketing or sales activities or behaviors, and business relations, as well as managing interactions between people), and/or a mental process (concepts performed in the mind, including an observation, evaluation, judgment or opinion). Guidance, 84 Fed. Reg. at 52, 54 (Step 2A, Prong One). Concepts performed in the mind include those that can be performed in the mind (including using a pen and paper) but for the recitation of generic computer components. Id. at 52 n.14. These types of judicial exceptions are deemed abstract ideas. Id. at 51–52.

Claim 11 recites a system that includes a remote patient monitoring apparatus that is configured to "collect patient data" and "transmit the patient data to the flagging and filtration system." The system of claim 11 also includes a flagging and filtration system that is part of a central processing unit that is configured to "receive the patient data from the remote patient monitoring apparatus," "store the patient data," "categorize and flag the patient data . . . creat[ing] categorization and flagging information that specifies a level of accuracy for the patient data," "present . . . the patient data [along] with the categorization and flagging information" to a "health care communication device for verification by a health care professional," "update the patient data in the database to reflect changes made to the levels of accuracy . . . during verification by the health care professional," "filter the patient data based on the levels of accuracy . . . to create a portion of the updated patient data," and "transmit the portion of the updated patient data to a client."

Thus, the system of claim 11 recites computer components that are configured to receive, analyze, modify, and transmit data. Each of the activities that the flagging and filtration portion of the CPU is configured to perform involves managing collected patient data, including the provision of that data to a health care communication device for verification of the accuracy of the data. The categorizing and flagging, as well as the updating and filtering, are activities that a person can do mentally or with a pen and paper, with the exception of the recitation of generic computer implementation. Like the data collection and analysis in Electric Power Group, claim 11 recites mental processes, which is one of the categories of subject matter deemed abstract under the Guidance. Electric Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 1353–54 (Fed. Cir. 2016) ("[W]e have treated analyzing information by steps people go through in their minds, or by mathematical algorithms, without more, as essentially mental processes within the abstract-idea category"); id. at 1353 ("[W]e have treated collecting information, including when limited to particular content (which does not change its character as information), as within the realm of abstract ideas.").
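To make the data flow just summarized concrete, the following minimal sketch models the recited sequence of receiving, categorizing and flagging, presenting for verification, updating, filtering, and transmitting. It is purely illustrative: the class and function names, the rule, and the accuracy labels are hypothetical and are not drawn from the application, the prior art, or the record, and nothing in the sketch bears on the eligibility analysis that follows.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Measurement:
    """A single patient measurement collected by a remote monitoring apparatus."""
    patient_id: str
    kind: str                     # e.g., "weight_kg" or "systolic_bp"
    value: float
    accuracy: str = "unreviewed"  # level of accuracy attached during categorization/flagging


class FlaggingAndFiltrationSystem:
    """Toy model of a receive -> categorize/flag -> verify -> filter -> transmit flow."""

    def __init__(self, rule_set: Dict[str, Callable[[float], str]]):
        self.rule_set = rule_set              # predefined rules keyed by measurement kind
        self.database: List[Measurement] = []

    def receive(self, data: List[Measurement]) -> None:
        # Store incoming patient data from the remote monitoring apparatus.
        self.database.extend(data)

    def categorize_and_flag(self) -> None:
        # Apply the predefined rule set to assign a provisional level of accuracy.
        for m in self.database:
            rule = self.rule_set.get(m.kind, lambda v: "accurate")
            m.accuracy = rule(m.value)

    def verify(self, reviewer: Callable[[Measurement], str]) -> None:
        # A health care professional reviews the flagged data and may change the level of accuracy.
        for m in self.database:
            m.accuracy = reviewer(m)

    def filter_and_transmit(self, send: Callable[[Measurement], None]) -> None:
        # Only the portion of the updated data meeting the accuracy criterion goes to the client.
        for m in self.database:
            if m.accuracy == "accurate":
                send(m)


if __name__ == "__main__":
    rules = {"weight_kg": lambda v: "accurate" if 30 <= v <= 250 else "flagged"}
    system = FlaggingAndFiltrationSystem(rules)
    system.receive([Measurement("p1", "weight_kg", 82.0),
                    Measurement("p1", "weight_kg", 8200.0)])  # likely a data-entry error
    system.categorize_and_flag()
    system.verify(lambda m: m.accuracy)  # simulated review: the reviewer confirms each flag
    system.filter_and_transmit(lambda m: print("to client:", m))
```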
STEP 2A, Prong Two:

Having made the determination that claim 11 recites abstract ideas, under the Guidance we next examine whether there are additional elements beyond the mental processes that integrate the abstract idea into a practical application. Under the Guidance, this is referred to as the "Prong Two" inquiry under "Step 2A." Guidance, 84 Fed. Reg. at 54–55. That is, under the Prong Two analysis we look to whether the claim as a whole "appl[ies], rel[ies] on, or use[s] the judicial exception in a manner that imposes a meaningful limit on the judicial exception." Id. One of the "examples in which a judicial exception has not been integrated into a practical application" is when "[a]n additional element . . . merely includes instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea." Id. at 55. See also buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1354 (Fed. Cir. 2014) (stating that "[t]he Court in Alice made clear that a claim directed to an abstract idea does not move into section 101 eligibility territory by 'merely requir[ing] generic computer implementation'") (alteration in original).

The additional elements of a remote patient monitoring apparatus, a health care communication device, a CPU, and a database are not sufficient recitations to integrate the judicial exceptions into a practical application. In the first place, Appellant's Specification discloses that these elements are generic computer elements. (See Spec. ¶¶ 22 and 29.) As our reviewing court has observed, "after Alice, there can remain no doubt: recitation of generic computer limitations does not make an otherwise ineligible claim patent-eligible." DDR Holdings v. Hotels.com, 773 F.3d 1245, 1256 (Fed. Cir. 2014) (citing Alice, 573 U.S. at 223). We fail to see how the generic recitation of these most basic computer components, or of a system including them, integrates the judicial exception so as to "impose[] a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the judicial exception." Guidance, 84 Fed. Reg. at 53.

Second, the fact that the remote patient monitoring apparatus is configured to collect data that is then transmitted to the flagging and filtration system of the CPU is simply a system element that is involved in pre-solution activity. CyberSource v. Retail Decisions, Inc., 654 F.3d 1366, 1375 (Fed. Cir. 2011) (mere data gathering does not make a claim patent-eligible).

As noted above, claim 11 recites a system for collecting information remotely and processing that information via the flagging and filtration system of the CPU for transmission to a third party. Appellant asserts that the processing combines automated categorization of the information with human verification to ensure accuracy before providing the information to a third party. (See, e.g., Response dated Feb. 28, 2017 at 9; Appeal Br. 5 (noting that the flagging and filtration system of the computer system "flag[s], and filter[s] patient data according to the level of accuracy for the patient data" . . . [and] allow[s] a set of trained healthcare professionals to review the data before it is sent to the client.").)
According to Appellant, this system that provides for "verification from healthcare professionals" improves automated categorization by "improv[ing] the accuracy of data after it is collected but before it is sent to the client (e.g., to a hospital)." (Appeal Br. 10.) Notably, Appellant does not assert, nor do we discern, any improvement to the technology for identifying or flagging errors, or to the provision of updated data that has a particular level of accuracy to a third party. The level of accuracy of the data is improved by virtue of the fact that a human reviews the data, which has been organized by the CPU according to particular predefined rules, determines the accuracy of the data, and changes the data as necessary to reflect a particular level of accuracy. (See Response at 9.) The system receives the human verification input just as it does the original data and assimilates and filters the data accordingly before transmitting it to a third party, i.e., data processing.

We agree with the Examiner that the claim does not recite an improvement to the structure or function of a computer. (Final Action 24.) The flagging and filtration system of the CPU is simply configured to perform basic data manipulation, including taking into account input from additional non-computer sources (i.e., the verification information from the health care professional). (See id. at 26.) The capability of the CPU to transmit the processed data is insignificant post-solution activity akin to displaying results of data processing. MPEP § 2106.05(f)(2) ("Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not provide significantly more.").

In light of the foregoing, we conclude that claim 11 is directed to the abstract idea of receiving, analyzing, modifying, and transmitting data, not a technological implementation or application of that idea. We disagree with Appellant that the fact that the claim is a "set of ordered and specific steps" means that it is a "technical improvement to technologies" (Appeal Br. 13–14). Cf. Trading Techs. Int'l, Inc. v. IBG LLC, No. 2017-2257, 2019 WL 1716242, at *3 (Fed. Cir. 2019) ("This invention makes the trader faster and more efficient, not the computer. This is not a technical solution to a technical problem."). Appellant relies on part of the identified abstract ideas to provide an inventive concept (the automated categorization and the updating of the data to reflect changes made by the health care professional), but an "abstract idea itself cannot supply the inventive concept." Trading Techs. Int'l, Inc. v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019) (quoting SAP Am., Inc. v. InvestPic, LLC, 898 F.3d 1161, 1170 (Fed. Cir. 2018)). Appellant's solution to the accuracy of remotely collected patient data provided to a client is not a technical improvement to any technology, even though the patient data that is supplied via a computer to the third-party client has been improved as to accuracy by an intermediate human review of that data.

Appellant's reliance on Bascom Global Internet Services, Inc. v. AT&T Mobility LLC, 827 F.3d 1341 (Fed. Cir. 2016), is unavailing (Appeal Br. 11–12).
In Bascom, although the underlying idea of filtering Internet content was deemed to be abstract, under step 2 of the Alice analysis the claims carved out a specific location for the filtering system, namely a remote Internet service provider (ISP) server, and required the filtering system to give users the ability to customize filtering for their individual network accounts. Bascom, 827 F.3d at 1352. The Federal Circuit recognized that Bascom's installation of an Internet content filter at a particular network location is "a technical improvement over prior art ways of filtering such content," because such an arrangement advantageously allows the Internet content filter to have "both the benefits of a filter on a local computer and the benefits of a filter on the ISP server" and "give[s] users the ability to customize filtering for their individual network accounts." Unlike the claims in Bascom, claim 11 does not cover a technology-based solution that improves an existing technological process. The improvement is to the non-technical process of data management, not to any computing resources used to implement that process.

Similarly, we do not find Appellant's reliance on DDR Holdings convincing (Appeal Br. 13–14). In DDR Holdings, the Federal Circuit evaluated the eligibility of claims "address[ing] the problem of retaining website visitors that, if adhering to the routine, conventional functioning of Internet hyperlink protocol, would be instantly transported away from a host's website after 'clicking' on an advertisement and activating a hyperlink." 773 F.3d at 1257. The Federal Circuit found the claims in DDR Holdings to be patent-eligible because "the claimed solution is necessarily rooted in computer technology in order to overcome a problem specifically arising in the realm of computer networks." Id. Specifically, the Federal Circuit found the claims addressed the "challenge of retaining control over the attention of the customer in the context of the Internet." Id. at 1258. Here, by contrast, Appellant's claim does not address a problem specifically arising in the realm of computer technology. The problem of inaccurate data is not a problem specific to computer technology, and Appellant's claim 11 recites a system that uses generic computing elements. In DDR Holdings, the computer network was not operating in its "normal, expected manner" and the claims did not "recite an invention that is [] merely the routine or conventional use of the Internet." Id. at 1258–59. Appellant does not direct attention to, and we do not see, where the Specification describes computer components acting in an unconventional manner to further the desired solution of transmitting more accurate information to a third-party client, and the claim does not recite such an unconventional interaction. Nor does the claim recite a specific technological way of accessing the data that is analyzed in an unconventional way. As we noted above, the improvement to data accuracy arises from the human review of the data before it is transmitted to the client.

Appellant contends the claimed invention does not monopolize or preempt the relevant technical field. (See Appeal Br. 13.) Such an argument does not compel finding the claim patent-eligible. "While preemption may signal patent ineligible subject matter, the absence of complete preemption does not demonstrate patent eligibility" or render a claim any less abstract.
Ariosa Diagnostics, Inc. v. Sequenom, Inc., 788 F.3d 1371, 1379 (Fed. Cir. 2015); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1362–63 (Fed. Cir. 2015) ("[T]hat the claims do not preempt all price optimization or may be limited to price optimization in the e-commerce setting do not make them any less abstract.").

STEP 2B:

We next turn to Step 2B to evaluate whether claim 11 provides an inventive concept. Guidance at 54. Neither a finding of novelty nor a non-obviousness determination automatically leads to the conclusion that the claimed subject matter is patent-eligible. Although the second step in the Mayo/Alice framework is termed a search for an "inventive concept," the analysis is not an evaluation of novelty or non-obviousness, but rather a search for "an element or combination of elements that is 'sufficient to ensure that the patent in practice amounts to significantly more than a patent upon the [ineligible concept] itself.'" Alice, 573 U.S. at 217–18 (alteration in original, citation omitted). Step 2B requires that we look to whether the claim "adds a specific limitation" beyond the judicial exception that is not "well-understood, routine, conventional activity in the field." Guidance at 56.

Using generic computer components to perform abstract ideas does not provide the necessary inventive concept. See Alice, 573 U.S. at 223 ("[T]he mere recitation of a generic computer cannot transform a patent-ineligible abstract idea into a patent-eligible invention."); see also SAP, 898 F.3d at 1163, 1170 (finding that the invocation of such computers for use in carrying out improved mathematical calculations amounts to recitation of what is well-understood, routine, and conventional). Also, "steps that do nothing more than spell out what it means to 'apply it on computer' cannot confer eligibility." Intellectual Ventures I LLC v. Capital One Bank, 792 F.3d 1363, 1370 (Fed. Cir. 2015). Furthermore, the fact that the system sends data to a human for verification and receives input from the verification to update the records is not inventive. Indeed, the Examiner has cited a number of prior art references that demonstrate that human review of remotely collected information for verification is standard practice. (See, e.g., Pratt ¶¶ 12, 62, 117–118.)

We conclude the claim fails to add a specific limitation beyond the judicial exception that is not well-understood, routine, and conventional in the field. As discussed above, the Specification describes using conventional components to collect patient data and general purpose computers to analyze and organize the data. The function to be performed by the computer at each step of the process is purely conventional. Using a computer to retrieve, select, and apply decision criteria to data and modify the data as a result amounts to electronic data query and retrieval, one of the most basic functions of a computer. None of these activities is used in an unconventional manner, nor does any produce an unexpected result. Appellant does not contend it invented any of these activities. In short, the claim does no more than require a generic computer to perform generic computer functions.

Considered as an ordered combination, the computer components of Appellant's claim 11 and their configuration to receive, analyze, modify, and transmit data add nothing that is not already present when they are considered separately.
The sequence of data reception, analysis, modification, and transmission is equally generic and conventional and "amount[s] to 'nothing significantly more' than an instruction to apply [an] abstract idea" using generic computer technology. See Alice, 573 U.S. at 226 (quoting Mayo, 566 U.S. at 79); see also Elec. Power, 830 F.3d at 1354–56 (holding that the sequence of gathering, analyzing, and displaying in real-time was abstract); Inventor Holdings, LLC v. Bed Bath & Beyond, Inc., 876 F.3d 1372, 1378 (Fed. Cir. 2017) (holding that a sequence of data retrieval, analysis, modification, generation, display, and transmission was abstract).

In light of the foregoing, we are not persuaded of error in the Examiner's determination that claim 11 is directed to an abstract idea and that the limitations of that claim do not transform it into significantly more than the abstract idea. Claims 12–15 and 17–30 have not been argued separately and therefore fall with claim 11. 37 C.F.R. § 41.37(c)(1)(iv).

II. New Matter

The Examiner finds that claims 25 and 29 lack written description as to the limitation "wherein is further configured to prevent, based on one or more categories of the patient data, the health care professional from verifying transmission of the patient data to the client." (Final Action 6.) Appellant does not contest the Examiner's rejection of claims 25 and 29 as failing to comply with the written description requirement. Thus, we summarily affirm this rejection. See MPEP § 1205.02 ("If a ground of rejection stated by the examiner is not addressed in the appellant's brief, appellant has waived any challenge to that ground of rejection and the Board may summarily sustain it, unless the examiner subsequently withdrew the rejection in the examiner's answer.").

III. Non-Obviousness

The Examiner finds that Knapp "discloses a system for improving the accuracy of patient data transmitted to a client." (Final Action 7.) The Examiner finds that the personal health digitizer is a remote patient monitoring apparatus that is configured to collect patient data, that the information management system is a CPU that has a flagging and filtration system and a database storing a predefined rule set, and that the health digitizer is in communication with the flagging and filtration system. (Id. at 7–9.) In particular, the Examiner asserts that the filtering steps implemented in steps 306, 307, 309, and 310 have pattern recognition subroutines that flag patterns generated from the personal health digitizer. (Id.) The Examiner also finds that Knapp discloses a health care communication device that is in communication with the flagging and filtration system. (Id. at 8.) The Examiner acknowledges that Knapp is silent regarding categorizing and flagging patient data to create categorization and flagging information that specifies a level of accuracy for the patient data, as well as presenting the data with the categorization information on the health care communication device, updating the patient data to reflect changes made to levels of accuracy during verification by the health care professional, filtering that data, and transmitting the filtered data. (Id. at 9–10.)
According to the Examiner, "Pratt teaches that it was old and well known in the art of monitoring patient medical conditions at the time of the invention to categorize and flag the patient data based on the predefined rule set to create categorization and flagging information on the health care communication device for verification by a health care professional." (Final Action 10.) The Examiner states:

In ¶ 0012, verified patient data and portions of patient data have levels of accuracy such as "inconsistent", "accurate" and "not to be accurate" categories, when compared to patient data previously stored in a database. The additional measurement mentioned in ¶ 0120, serve as changes made to obtain the most accurate level of patient information possible, where a healthcare provider is a physician (¶ 0121). In 0010 Pratt flags possible errors and presents them to the user for validation.

(Id. at 10–11.) The Examiner concludes from the foregoing that it would have been obvious to one of ordinary skill in the art of monitoring patient medical conditions at the time of the invention to modify the method, software and system of Knapp

to categorize and flag the patient data based on the predefined rule set to create categorization and flagging information on the health care communication device for verification by a health care professional, the patient data with the categorization and flagging information, update the patient data in the database to reflect changes made to the levels of accuracy specified by the categorization and flagging information during the verification by the health care professional, filter the patient data based on the levels of accuracy specified in the categorization and flagging information, including the changes made to the levels of accuracy, to create a portion of the updated patient data and transmit the portion of the updated patient data to a client, as taught by Pratt, to provide standardized time saving information to appreciate healthcare providers when there are life-threading conditions of a patient.

(Id. at 11.)

The Examiner has the initial burden of establishing a prima facie case of obviousness under 35 U.S.C. § 103. In re Oetiker, 977 F.2d 1443, 1445 (Fed. Cir. 1992) ("[T]he examiner bears the initial burden, on review of the prior art or on any other ground, of presenting a prima facie case of unpatentability").

We disagree with the Examiner that Knapp discloses a system for improving the accuracy of patient data transmitted to a client in a way that would make the teachings of Pratt regarding verification of categorized and flagged data by a health care professional relevant. While Knapp certainly indicates that "data that is collected by the patient must be accurate in nature" (Knapp 3:13–17), Knapp is concerned with "data collected [having] sufficient baseline data to enable the medical practitioner to detect anomalies in the pattern of data that is collected over time" (Knapp 3:17–20). Knapp states "there is presently no mechanism available to collect data on a frequent basis and communicate this data to the medical practitioner, or to perform an automated pattern analysis function on such data, if collected, or to automate a report to the consumer." (Id. at 3:20–24.) The Knapp invention is directed to improving the accuracy of data analysis by comparing it to sufficient baseline data.
As stated by Knapp:

the information management system collects a statistically valid volume of data from numerous consumers and performs pattern matching and other statistical analyses on this data in a multi-dimensional manner to thereby deliver relevant information to the various classes of users who access the information management system.

(Id. at 3:37–42.)

Pratt is directed to a system "for gathering, formatting, validating, storing and/or distributing medical information" where the patient information submitted is timely validated. (Pratt ¶¶ 6, 7, 10, 12, 39.) Pratt explains the validation of data by the physician upon review of data from the medical device or physician-entered data, such as the reasonableness of objective data, e.g., the height or weight of a patient, or the reasonableness of a diagnosis. (Id. ¶¶ 115–119.) Pratt explains that in some instances, "to obtain the most accurate information possible, it sometimes is beneficial to perform multiple or back-up measurements," and the system can prompt a physician to do so. (Id. ¶ 120.) Pratt explains that

[i]f the system requests back-up measurements, but the physician has not entered them, the system will prompt the physician to enter the additional measurement data (block 619). The physician can elect to enter the additional measurements (block 619), or the physician can elect not to enter the additional measurement. If the additional measurements are not entered, the system can flag the measurement data as being less reliable.

(Id.)

Other than to assert that both Knapp and Pratt include in their systems flagging and filtration systems where data is categorized based on predefined rule sets, the Examiner has not identified what about these systems makes their components equivalent such that one of ordinary skill in the art would combine the data verification teaching of Pratt with the Knapp system. In other words, the Examiner does not explain how or why the similarities of these systems are such that one of ordinary skill in the art would understand them to perform similar functions such that one would have categorized and flagged patient data, based on the predefined rule set that flags patterns generated from the personal health digitizer of Knapp, to create categorization and flagging information that specifies a level of accuracy for the patient data as in Pratt. Nor do we find the reason to be self-evident from the teachings of the references. "[R]ejections on obviousness grounds cannot be sustained by mere conclusory statements." KSR Int'l Co. v. Teleflex Inc., 550 U.S. 398, 418 (2007) (alteration in original).

In view of the foregoing, we reverse the Examiner's rejection of claims 11, 15, 23, 24, 26–28, and 30 under 35 U.S.C. § 103(a) as unpatentable over Knapp and Pratt. The additional references relied on by the Examiner in the remaining obviousness rejections do not cure the defect just discussed. Therefore, we also reverse the rejection of:

claims 12–14 under 35 U.S.C. § 103(a) as unpatentable over Knapp, Pratt, and Banet;

claims 17–20 under 35 U.S.C. § 103(a) as unpatentable over Knapp, Pratt, and Papadopoulos;

claims 21 and 22 under 35 U.S.C. § 103(a) as unpatentable over Knapp, Pratt, and Shusterman; and

claims 25 and 29 under 35 U.S.C. § 103(a) as unpatentable over Knapp, Pratt, and Muradia.

CONCLUSION

In summary:
Claims Rejected | 35 U.S.C. § | Reference(s)/Basis | Affirmed | Reversed
11–15, 17–30 | 101 | | 11–15, 17–30 |
11, 15, 23, 24, 26–28, 30 | 103 | Knapp, Pratt | | 11, 15, 23, 24, 26–28, 30
12–14 | 103 | Knapp, Pratt, Banet | | 12–14
17–20 | 103 | Knapp, Pratt, Papadopoulos | | 17–20
21, 22 | 103 | Knapp, Pratt, Shusterman | | 21, 22
25, 29 | 103 | Knapp, Pratt, Muradia | | 25, 29
25, 29 | 112, 1st paragraph | New matter | 25, 29 |
Overall Outcome | | | 11–15, 17–30 |

TIME PERIOD FOR RESPONSE

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a).

AFFIRMED