UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450
www.uspto.gov

Application No.: 13/061,959          Filing Date: 03/03/2011
First Named Inventor: Michael Chun-chieh Lee
Attorney Docket No.: 2008P01289WOUS          Confirmation No.: 1727
Correspondent: PHILIPS INTELLECTUAL PROPERTY & STANDARDS, 465 Columbus Avenue, Suite 340, Valhalla, NY 10595
Examiner: COLEMAN, CHARLES P.          Art Unit: 3626
Notification Date: 09/18/2019          Delivery Mode: ELECTRONIC

UNITED STATES PATENT AND TRADEMARK OFFICE
____________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________________

Ex parte MICHAEL CHUN-CHIEH LEE and LILLA BOROCZKY
____________________

Appeal 2018-005280[1]
Application 13/061,959
Technology Center 3600
____________________

Before MURRIEL E. CRAWFORD, PHILIP J. HOFFMANN, and BRADLEY B. BAYAT, Administrative Patent Judges.

CRAWFORD, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF THE CASE

This is an appeal from the final rejection of claims 1–6 and 16. We have jurisdiction to review the case under 35 U.S.C. §§ 134(a) and 6(b).

The invention relates generally to medical diagnosis. Spec. 1, line 2.

[1] The Appellants identify Koninklijke Philips Electronics N.V. as the real party in interest. Appeal Br. 2.

Claim 16 is illustrative:

    16. A method for applying a computer-aided diagnosis that has been split into parts, the method comprising:
    defining input patient data comprising at least imaging data;
    applying, by a processor, image classifiers derived from different patient strata to the input patient data to generate a plurality of diagnostic hypotheses of the patient, two or more of the diagnostic hypotheses being based on two or more different hypothesized values of additional clinical information;
    storing, by the processor, the plurality of diagnostic hypotheses in a computer-readable database;
    requesting, by the processor, input of an actual value of the additional clinical information about the patient based on the plurality of diagnostic hypotheses;
    applying, by the processor a clinical decision tree to the additional clinical information about the patient to determine strata to which the patient belongs; and
    using this stratification to select a final diagnosis from the plurality of diagnostic hypotheses.

The Examiner rejected claims 1–6 and 16 under 35 U.S.C. § 101 as directed to ineligible subject matter in the form of abstract ideas.

The Examiner rejected claims 1–6 and 16 under 35 U.S.C. § 112, second paragraph, as indefinite.

We AFFIRM.

ANALYSIS

Indefiniteness

In the Final Action, the Examiner rejected all pending claims under 35 U.S.C. § 112, second paragraph, as indefinite, based on two different claim limitations. Final Act. 3–4.
Specifically, the Examiner finds, in regard to claims 1 and 16, that it "is unclear what data/probabilities the 'corresponding probabilities of illness' are based upon, and how the 'corresponding probabilities of illness' differ from the likelihood of the patient having the illness," and that it "is unclear how the hypothesized values are generated, and it is also unclear what a 'select likelihood of the plurality of likelihoods corresponding . . . ' encompasses." Id.

The Appellants advance arguments, relative to claim 1, asserting error on the part of the Examiner, supported by claim language and references to the Specification: page 4, lines 18–26; page 7, line 26, to page 8, line 1; and page 8, lines 17–22. Appeal Br. 6–9. The Appellants also explain that independent claim 16 does not recite the language in claim 1 that led to the rejection. Appeal Br. 9–10.

The Examiner indicates, shortly after the filing of the Appeal Brief, that claim amendments filed two days before the filing of the Appeal Brief will be entered, "to fix antecedent problem." Advisory Act. 2, mailed July 28, 2017. However, we find no pending rejection in the Final Action related to antecedent basis. We also find the Examiner is not responsive to the Appellants' arguments as to indefiniteness. See Reply Br. 2 ("the Examiner's Answer does not state whether the Examiner disagrees with the applicants' arguments and does not present any reasons for disagreement"). Indeed, the Answer does not mention any rejection under 35 U.S.C. § 112.

We find the Appellants' argument and evidence as to the definiteness of the claims to be persuasive, for the reasons advanced in the Appeal Brief. The Examiner has not responded with additional argument or evidence. We do not sustain the rejection of the claims as indefinite under 35 U.S.C. § 112, second paragraph.

Patentable Subject Matter

An invention is patent-eligible if it claims a "new and useful process, machine, manufacture, or composition of matter." 35 U.S.C. § 101. However, the Supreme Court has long interpreted 35 U.S.C. § 101 to include implicit exceptions: "[l]aws of nature, natural phenomena, and abstract ideas" are not patentable. E.g., Alice Corp. v. CLS Bank Int'l, 573 U.S. 208, 216 (2014).

In determining whether a claim falls within an excluded category, we are guided by the Supreme Court's two-step framework, described in Mayo and Alice. Id. at 217–18 (citing Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 75–77 (2012)). In accordance with that framework, we first determine what concept the claim is "directed to." See Alice, 573 U.S. at 219 ("On their face, the claims before us are drawn to the concept of intermediated settlement, i.e., the use of a third party to mitigate settlement risk."); see also Bilski v. Kappos, 561 U.S. 593, 611 (2010) ("Claims 1 and 4 in petitioners' application explain the basic concept of hedging, or protecting against risk.").

Concepts determined to be abstract ideas, and thus patent ineligible, include certain methods of organizing human activity, such as fundamental economic practices (Alice, 573 U.S. at 219–20; Bilski, 561 U.S. at 611); mathematical formulas (Parker v. Flook, 437 U.S. 584, 594–95 (1978)); and mental processes (Gottschalk v. Benson, 409 U.S. 63, 69 (1972)). Concepts determined to be patent eligible include physical and chemical processes, such as "molding rubber products" (Diamond v. Diehr, 450 U.S. 175, 191 (1981)); "tanning, dyeing, making water-proof cloth, vulcanizing India rubber, smelting ores" (id. at 182 n.7 (quoting Corning v. Burden, 56 U.S. 252, 267–68 (1854))); and manufacturing flour (Benson, 409 U.S. at 69 (citing Cochrane v. Deener, 94 U.S. 780, 785 (1876))).

In Diehr, the claim at issue recited a mathematical formula, but the Supreme Court held that "[a] claim drawn to subject matter otherwise statutory does not become nonstatutory simply because it uses a mathematical formula." Diehr, 450 U.S. at 187; see also id. at 191 ("We view respondents' claims as nothing more than a process for molding rubber products and not as an attempt to patent a mathematical formula."). Having said that, the Supreme Court also indicated that a claim "seeking patent protection for that formula in the abstract . . . is not accorded the protection of our patent laws, . . . and this principle cannot be circumvented by attempting to limit the use of the formula to a particular technological environment." Id. (citing Benson and Flook); see, e.g., id. at 187 ("It is now commonplace that an application of a law of nature or mathematical formula to a known structure or process may well be deserving of patent protection.").

If the claim is "directed to" an abstract idea, we turn to the second step of the Alice and Mayo framework, where "we must examine the elements of the claim to determine whether it contains an 'inventive concept' sufficient to 'transform' the claimed abstract idea into a patent-eligible application." Alice, 573 U.S. at 221 (quotation marks omitted). "A claim that recites an abstract idea must include 'additional features' to ensure 'that the [claim] is more than a drafting effort designed to monopolize the [abstract idea].'" Id. (quoting Mayo, 566 U.S. at 77). "[M]erely requir[ing] generic computer implementation[] fail[s] to transform that abstract idea into a patent-eligible invention." Id.

The PTO recently published revised guidance on the application of § 101. 2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg. 50 (Jan. 7, 2019) ("Guidance"). Under the Guidance, we first look to whether the claim recites: (1) any judicial exceptions, including certain groupings of abstract ideas (i.e., mathematical concepts, certain methods of organizing human activity such as a fundamental economic practice, or mental processes); and (2) additional elements that integrate the judicial exception into a practical application (see Manual of Patent Examining Procedure ("MPEP") § 2106.05(a)–(c), (e)–(h)). Only if a claim (1) recites a judicial exception and (2) does not integrate that exception into a practical application, do we then look to whether the claim: (3) adds a specific limitation beyond the judicial exception that is not "well-understood, routine, conventional" in the field (see MPEP § 2106.05(d)); or (4) simply appends well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception. See Guidance.

The Examiner finds the claims "are directed to the abstract idea of 'fusing clinical and image features for computer-aided diagnosis,'[2] etc." Answer 3; see also Final Act. 2:

    (Claim(s) 1-6 and 16 is/are directed to the abstract ideas [of] a method and system for implementing a method that stores prior diagnoses and corresponding probabilities of the illness and assesses the plurality of likelihoods of the illness based on: the prior diagnoses, the image data and proposed possible values of clinical data (i.e. comparing new and stored information, and using rules to identify options) and generates a plurality of likelihoods of an illness based on the processed medical image data alone and based on potential values for additional clinical data; and requests and receives an actual value of the clinical data based on a result of the one or more classifier algorithms (i.e. organizing information through mathematical correlations).)

[2] This phrase appears in the Specification at page 6, lines 22–27.

Claim 16 recites a "method for applying a computer-aided diagnosis that has been split into parts." Diagnosis, including medical diagnosis of illness, is a mental process routinely performed by, for example, physicians, using information available to the physician, and mental thought processes. The method involves the following steps.

First recited is "defining input patient data comprising at least imaging data." The Specification explains that the "CADx algorithm combines image data in CT images (data-type 1, in this example) with clinical data in the clinical parameters (considered data-type 2 in this example) of the patient (e.g. emphysema status, lymph node status)." Spec. 3, lines 27–30 (cited at Appeal Br. 5). We construe this as a data gathering step, because it collects both image and clinical data, which can fairly be characterized as insignificant extra-solution activity. See In re Bilski, 545 F.3d 943, 963 (Fed. Cir. 2008) (en banc), aff'd sub nom. Bilski v. Kappos, 561 U.S. 593 (2010) (characterizing data gathering steps as insignificant extra-solution activity).

The method next recites:

    applying, by a processor, image classifiers derived from different patient strata to the input patient data to generate a plurality of diagnostic hypotheses of the patient, two or more of the diagnostic hypotheses being based on two or more different hypothesized values of additional clinical information.

The Specification explains that deriving image classifiers involves using a decision tree "to stratify the patients in the training data base, also known as yielding patient strata." Spec. 7, lines 14–15 (cited at Appeal Br. 5). The Specification explains, as to generating diagnostic hypotheses, that "a wide variety of image-based classifiers are developed and then use the clinical data to decide which classifiers to use for different clinical 'risk' groups. The classifiers thus created are subsequently used for the computer-aided diagnosis of new, previously unseen patients 500." Spec. 9, lines 20–23 (cited at Appeal Br. 5). As to the two hypothesized diagnoses, the Specification explains that a "decision engine" accesses databases "to compute a partial diagnosis based on the image data 910 and further compute potential complete diagnoses based on possible clinical data." Spec. 13, lines 10–14 (cited at Appeal Br. 5). We construe this as using a computer to think like a physician, specifically to hypothesize medical diagnoses based on stores of past image and clinical data about other patients' illnesses.
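For illustration only, the kind of computation described in the preceding paragraphs can be sketched as follows. The sketch is not the applicants' code and is not recited in the claims; the smoking-status example, the stratum names, and the classify_image function are hypothetical stand-ins for the stratum-specific image classifiers and the hypothesized values of missing clinical data discussed above.

# Purely illustrative sketch; all names and values are hypothetical (see text above).
from dataclasses import dataclass


@dataclass
class Hypothesis:
    hypothesized_value: bool         # assumed value of the missing clinical datum (e.g., smoker?)
    stratum: str                     # patient stratum implied by that assumption
    likelihood_of_malignancy: float


def classify_image(image_features, stratum):
    """Stand-in for an image-based classifier trained on one patient stratum; returns a value in [0, 1]."""
    weight = 0.8 if stratum == "high_risk" else 0.4   # assumed per-stratum weighting
    score = weight * sum(image_features) / max(len(image_features), 1)
    return min(max(score, 0.0), 1.0)


def generate_hypotheses(image_features):
    """Generate one diagnostic hypothesis per hypothesized value of the missing clinical datum."""
    hypotheses = []
    for smoker in (True, False):
        stratum = "high_risk" if smoker else "low_risk"
        hypotheses.append(Hypothesis(smoker, stratum, classify_image(image_features, stratum)))
    return hypotheses


if __name__ == "__main__":
    for hypothesis in generate_hypotheses([0.7, 0.9, 0.5]):
        print(hypothesis)

Each hypothesis in the sketch pairs an assumed value of the missing clinical datum with the likelihood computed by the classifier for the corresponding stratum.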
The method continues by "storing, by the processor, the plurality of diagnostic hypotheses in a computer-readable database." The Specification describes that the "results of the computations are further stored in decision database 930." Spec. 13, line 23 (cited at Appeal Br. 5). This is merely a step to remember, or store, data.

After storing the developed hypotheses, the method continues by "requesting, by the processor, input of an actual value of the additional clinical information about the patient based on the plurality of diagnostic hypotheses." The Specification describes that "the request for additional clinical data is sent to interface engine 980 with display terminal 970 which queries the operator for additional information." Spec. 13, lines 14–17 (cited at Appeal Br. 6). We view this as another data-gathering step, which is insignificant extra-solution activity.

Then, the claim recites, "applying, by the processor a clinical decision tree to the additional clinical information about the patient to determine strata to which the patient belongs." The Specification describes that a new "case is sent to a clinical decision tree 620 similar to the clinical decision tree 330 and 530 of Figs 3 and 5 respectively. One of two alternate paths is selected." Spec. 11, lines 1–5 (cited at Appeal Br. 6). We view this as a mental thought step, because it involves determinations based on data, implemented on a computer.

Finally, the claim recites, "using this stratification to select a final diagnosis from the plurality of diagnostic hypotheses." The Specification describes that the result of the clinical decision tree is the use of paths to select which path is activated. The active path allows the result of one of the two image-based classifier ensemble results (either the high risk result or the low risk result) to be stored in the likelihood of malignancy module 660. Spec. 11, lines 13–16 (cited at Appeal Br. 6). This again is a mental step, which involves selecting that a feature is either high or low in risk, based on the examined data.

The claim thus describes a process for medical risk diagnosis, involving creating and selecting from hypothetical diagnoses derived from past image and clinical data. Although the claim recites using a "processor" for four of the six recited steps (applying, requesting, and storing), the process, in general, merely mimics the mental process of medical diagnosis based on image and clinical data which, we have no doubt, physicians have performed for decades. For example, the Specification describes, in background, that "the decision-making process associated with evaluation of malignancy typically includes integration of non-imaging evidence," and "studies have demonstrated that both diagnostic ratings and perception of radiological features are affected by patient histories." Spec. 1, lines 7–12. This supports the idea that diagnoses made by medical professionals based on multiple types of data are well known.

Under Prong One of Revised Step 2A of the Guidance, the claim is fundamentally directed to a mental process, using image and clinical data to determine whether collected image data represents high or low risk to the patient. See Guidance at 52, fn. 14 ("If a claim, under its broadest reasonable interpretation, covers performance in the mind but for the recitation of generic computer components, then it is still in the mental processes category unless the claim cannot practically be performed in the mind.").

The only component recited beyond the abstract mental process is a "processor." The Specification describes that the "system uses software which processes the data from the data repository 111. The software is run on a processor 102 which implements the incomplete data on a CADx algorithm 121 based system." Spec. 6, lines 28–30. The Specification further describes:

    a computer operable apparatus including, but not limited to a computer database data storage embedded within a computer memory, a computer output display terminal, a keyboard for inputting data, an interface for the import and extraction of data and any hardware and software components needed to make the proposed application function.

Spec. 6, lines 22–27 (emphasis added). We interpret this to mean a general-purpose computer is sufficient for performing the method. For example, the Specification describes that a "computer operable software means comparator step 160 compares the N different candidate CADx calculation results or potential solutions for the likelihood of malignancy and decides if they are within a pre-set tolerance." Spec. 4, lines 27–29. General-purpose computers are capable of comparing data and determining if a value is within a preset range. The Specification thus supports that the abstract mental process of claim 16 can be implemented on a general-purpose computer.

Under Prong Two of Revised Step 2A of the Guidance, we are directed to determine if the abstract idea expressed in the claim language is "integrated into a practical application." Guidance at 54. Here, claim 16 recites steps that receive, categorize, store, and request data, make determinations from the data, and make a selection based on those determinations. The method does not improve the underlying "processor" recited as performing some limitations of claim 16, because any computer can be used to execute the claimed method. See Spec. 6, lines 22–27. In addition, the method is directed to medical diagnosis (Spec. 1, line 2), and as such the claimed method does not improve another technology. MPEP § 2106.05(a). Because a particular computer is not required, the claim also does not define or rely on a "particular machine." MPEP § 2106.05(b). Further, the method does not transform matter. MPEP § 2106.05(c). As such, the method has no other meaningful limitations (MPEP § 2106.05(e)), and thus merely recites instructions to execute the abstract idea on a computer (MPEP § 2106.05(f)). Therefore, we determine that claim 16 does not integrate the judicial exception into a practical application, under Prong Two of Revised Step 2A.

Under Step 2B of the Guidance, we determine if the additional elements outside the scope of the abstract idea offer "something more" in the form of an "inventive concept" that would transform the abstract idea into eligible subject matter. We find no such limitations. Beyond the use of image and clinical data to determine whether collected image data represents high or low risk to the patient, the claim only additionally recites the use of a "processor" for four limitations.
As we noted above, the processor is called on to request and receive data, categorize data, compare data, and make determinations about data, such as whether it meets a tolerance criterion. The operations of storing, analyzing, receiving, and writing data are primitive computer operations found in any computer system. See In re Katz Interactive Call Processing Patent Litig., 639 F.3d 1303, 1316 (Fed. Cir. 2011) ("Absent a possible narrower construction of the terms 'processing,' 'receiving,' and 'storing,' discussed below, those functions can be achieved by any general purpose computer without special programming."). Because the operations the processor performs (request, receive, analyze, and store) are basic operations, and the processor is a general-purpose computer, there are no additional limitations that are not well-understood, routine, and conventional. Thus, the claim does not recite an "inventive concept."

As a result, we agree with the Examiner that claim 16 is directed to an abstract idea, which is not integrated into a "practical application," and which does not recite an "inventive concept."

Independent claim 1, though similar to claim 16 in operation, recites language we must consider separately. Claim 1 is directed to "a system for providing interactive computer-aided analysis of medical images." An algorithm for analyzing medical images is essentially a method of organizing human behavior, because it specifies how a human should undertake steps for medical diagnosis.

The first component of the claimed system is "an image processor that processes medical image data of a patient." In support of the limitation, the Appellants direct us to Spec. 4, lines 2–11. There, the image processor is described as retrieving image data, such as from a hospital's PACS system, and "applying a CADx algorithm." Spec. 4, lines 2–7 (cited at Appeal Br. 4). The Specification provides an example of one algorithm. Based on the broad language of the claim, we construe the step that "processes" as applying any algorithm, because the claim language does not recite any particular algorithm, and we do not import the example from the Specification into the claim. See SuperGuide Corp. v. DirecTV Enterprises, Inc., 358 F.3d 870, 875 (Fed. Cir. 2004) ("Though understanding the claim language may be aided by explanations contained in the written description, it is important not to import into a claim limitations that are not part of the claim. For example, a particular embodiment appearing in the written description may not be read into a claim when the claim language is broader than the embodiment."). Thus, the image processor is a component that applies an algorithm to received image data, much the way a human looking at image data applies a mental algorithm to examine the image.

The next recited component is "a database of prior diagnoses and corresponding probabilities of an illness." This is a collection of data, similar to information about prior diagnoses and probabilities that a human may remember in their mind, or by looking at records of illnesses.
The next recited component is "a decision engine that generates a plurality of likelihoods of the patient having the illness." The likelihoods are based on "the prior diagnoses and corresponding probabilities of the illness, the processed medical image data, and a plurality of different hypothesized values of additional clinical data." The Specification describes that the method "tests different proposed possible values of" clinical data, and "results are computed." Spec. 4, lines 14–19 (cited at Appeal Br. 5). In addition, the Specification describes that "data is processed using a processor 102 which includes software that performs at least one of three estimates." Spec. 6, lines 30–32 (cited at Appeal Br. 5). These citations do not explicitly describe the generation of likelihoods of the patient having the illness, but the Specification also describes that

    stratified data 550 is a series of at least one case 552 to N cases 554 generated to determine if an individual case presents a high risk 556 or a low risk 558 of possessing a given illness or condition based on whether the probability of a person with a specific health background is likely to have or not have a given illness or condition, i.e. based on the information contained within the clinical data 510.

Spec. 10, lines 14–18. We construe the generation of likelihoods to involve an analysis of historical data, to which calculations or estimates, or both, are applied. Selecting hypotheses, based on calculations and estimates from available historical data, is a task that humans can, and do, perform mentally through thought.

The next recited component is "an interface engine that: determines whether a significant difference exists among the plurality of likelihoods," and then takes one of two different actions, based on the determination. The Specification describes that the engine "compares the N different candidate CADx calculation results or potential solutions for the likelihood of malignancy and decides if they are within a pre-set tolerance." Spec. 4, lines 27–29 (cited at Appeal Br. 5). The Specification also describes that the system acts "to evaluate whether the diagnosis based on incomplete data is significantly different than estimated diagnoses created with complete data." Spec. 7, lines 1–3 (cited at Appeal Br. 5). Acts of comparing data, and of evaluating data for differences, are acts a human performs mentally through thought.

The final recited portion of claim 1 involves executing one of two steps; at the end of each, the engine "identifies the select likelihood based on the plurality of likelihoods." This identifying is a step that can be performed mentally by a person, by making a decision based on the available data. In one of the two steps, "if a significant difference exists among the plurality of likelihoods: requests and receives an actual value of the additional clinical data." That is, if additional information is needed to make a decision, because there is a difference in the data that prevents a choice, additional data is requested and used for the decision. This is a step that humans, especially medical professionals, routinely perform mentally, by understanding that more information is needed before a decision on the information can be made.
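Again for illustration only, the comparison and conditional request described in the preceding paragraphs can be sketched as follows. The tolerance value, the smoking example, and the ask_actual_value callback are hypothetical and correspond only loosely to the Specification's pre-set tolerance and its request for additional clinical data; the sketch is not the claimed "interface engine" or the applicants' code.

# Purely illustrative sketch; all names and values are hypothetical (see text above).
TOLERANCE = 0.10  # assumed pre-set tolerance on the spread of the candidate likelihoods


def select_likelihood(hypotheses, ask_actual_value):
    """hypotheses: list of (hypothesized_value, likelihood) pairs;
    ask_actual_value: callable invoked only when the likelihoods differ significantly."""
    likelihoods = [likelihood for _, likelihood in hypotheses]
    if max(likelihoods) - min(likelihoods) <= TOLERANCE:
        # No significant difference: the missing clinical datum would not change the
        # outcome, so report an average without requesting it.
        return sum(likelihoods) / len(likelihoods)
    # Significant difference: request the actual value and keep the matching hypothesis.
    actual_value = ask_actual_value()
    for hypothesized_value, likelihood in hypotheses:
        if hypothesized_value == actual_value:
            return likelihood
    raise ValueError("no hypothesis matches the supplied clinical value")


if __name__ == "__main__":
    # Example: two candidate likelihoods of malignancy, one per hypothesized smoking status.
    print(select_likelihood([(True, 0.82), (False, 0.35)], ask_actual_value=lambda: True))

Every operation in the sketch is a comparison, an average, or a lookup, consistent with the observation above that general-purpose computers are capable of comparing data and determining whether a value is within a preset range.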
Other than the claimed "image processor," "database," "decision engine," and "interface engine," the claimed system recites functions that are able to be performed by a human using thought, memory, and, potentially, pen and paper. The Specification describes the processors, database, and engines as follows: "The system uses software which processes the data from the data repository 111. The software is run on a processor 102 which implements the incomplete data on a CADx algorithm 121 based system. The data is processed using a processor 102 which includes software." Spec. 6, lines 28–32 (cited at Appeal Br. 4–5). The Specification provides no additional description of the software or computer system that operates the software, so we must assume they are general purpose in nature.

Under Prong One of Revised Step 2A of the Guidance, claim 1, similar to claim 16, is directed to the abstract idea of a mental process of using image and clinical data to determine whether collected image data represents high or low risk to the patient, implemented with general-purpose software and computer hardware. As with claim 16, the system of claim 1 does not improve the underlying "processor," "database," or "engines" recited as performing the limitations, because any unspecified computer can be used to execute the claimed functions. See Spec. 6, lines 22–27. In addition, the method is directed to medical diagnosis (Spec. 1, line 2), and as such the claimed method does not improve another technology. MPEP § 2106.05(a). Because a particular computer is not required, the claim also does not define or rely on a "particular machine." MPEP § 2106.05(b). Further, the method does not transform matter. MPEP § 2106.05(c). As such, the method has no other meaningful limitations (MPEP § 2106.05(e)), and thus merely recites instructions to execute the abstract idea on a computer (MPEP § 2106.05(f)). Therefore, we determine that claim 1 does not integrate the judicial exception into a practical application, under Prong Two of Revised Step 2A.

Further, similar to the analysis for claim 16, claim 1 recites only unspecified, and therefore well-understood, routine, and conventional components beyond the recitation of the abstract mental functions. Therefore, claim 1, like claim 16, does not recite an "inventive concept" that represents "something more" than the abstract idea recited by the claim, under Step 2B of the Guidance.

Dependent claim 2 recites "the interface engine displays an average likelihood of the illness on a display terminal when a significant difference does not exist." This is both the result of a mental calculation, and an output step that is insignificant extra-solution activity. See Bilski v. Kappos, 561 U.S. 593, 610–11 (2010) ("Flook stands for the proposition that the prohibition against patenting abstract ideas 'cannot be circumvented by attempting to limit the use of the formula to a particular technological environment' or adding 'insignificant post solution activity.'") (quoting Diehr, 450 U.S. at 191–92).
Dependent claim 3 limits the nature of the clinical data used to "at least one of medical history, health history, family history, physical measurements, and demographic data." Dependent claim 4 limits which data is "used to stratify data." Dependent claim 5 recites additional functions capable of being performed through mental thought: "quantify risk factors, create an image-based classifier library, and derive an ensemble." Dependent claim 6 recites that the "interface engine" performs additional assessment and determination steps, which can be performed mentally by a human, because it "assesses the plurality of likelihoods to determine a range of the likelihoods," and determines if a "significant difference" exists. None of the dependent claims recites subject matter beyond the abstract idea recited in claim 1; instead, the dependent claims merely clarify, limit, or expand the mental functions embodied within the abstract idea of claim 1.

We therefore agree with the Examiner that claims 1–6 and 16 recite abstract ideas capable of being performed through mental thought, without additional elements that integrate the abstract idea into a "practical application," or recite an "inventive concept" that represents "significantly more" than an abstract idea.

We turn to and consider the Appellants' arguments in light of the above analysis.

We are not persuaded by the Appellants' argument that the claims are not directed to an abstract idea because there are no pending prior art rejections. Appeal Br. 11. A finding of novelty or non-obviousness does not automatically lead to the conclusion that the claimed subject matter is patent-eligible. Although the second step in the Mayo/Alice framework is termed a search for an "inventive concept," the analysis is not an evaluation of novelty or non-obviousness, but rather, a search for "an element or combination of elements that is 'sufficient to ensure that the patent in practice amounts to significantly more than a patent upon the [ineligible concept] itself.'" Alice Corp., 134 S. Ct. at 2355. "Groundbreaking, innovative, or even brilliant discovery does not by itself satisfy the § 101 inquiry." Ass'n for Molecular Pathology v. Myriad Genetics, Inc., 569 U.S. 576, 591 (2013). A novel and non-obvious claim directed to a purely abstract idea is, nonetheless, patent-ineligible. See Mayo, 566 U.S. at 90; see also Diamond v. Diehr, 450 U.S. 175, 188–89 (1981) ("The 'novelty' of any element or steps in a process, or even of the process itself, is of no relevance in determining whether the subject matter of a claim falls within the § 101 categories of possibly patentable subject matter.").

Citing similarity with "Google, and DDR,"[3] the Appellants assert the claims are "'necessarily rooted in computer technology in order to overcome a problem specifically arising in the realm of computer technology.'" Appeal Br. 11. The Appellants describe shortcomings in the algorithms of prior art computer-aided diagnostics programs, asserting that "before a practitioner can take advantage of the potential benefits of a computer aided diagnostic system, the practitioner must prescribe all of the tests that are necessary to provide the values of the clinical data required by the particular computer aided diagnostic system," and that prior art databases include "clinical data that is irrelevant or redundant." Appeal Br. 11.

[3] No citations are provided by the Appellants, so we are uncertain to which cases these refer.
Appellants' apparent reliance on DDR Holdings, LLC v. Hotels.com, L.P., 773 F.3d 1245 (Fed. Cir. 2014) is unavailing, because the claims at issue do not address a problem unique to the Internet. See Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1371 (Fed. Cir. 2015) ("The patent at issue in DDR provided an Internet-based solution to solve a problem unique to the Internet that (1) did not foreclose other ways of solving the problem, and (2) recited a specific series of steps that resulted in a departure from the routine and conventional sequence of events after the click of a hyperlink advertisement. The patent claims here do not address problems unique to the Internet, so DDR has no applicability." (citation omitted)). Indeed, humans attempting diagnosis may well seek more data than is actually necessary to make a diagnosis, and may remember information that is irrelevant or redundant. The identified problems are seen in human diagnostic-process shortcomings, or human decisions of functions to implement in software, not Internet or computer-related problems.

We are unpersuaded by the Appellants' argument that the "preprocessing of the data" to generate "hypothetical values for each combination of possible values of each missing clinical data item, and the subsequent determination of the likelihoods of the patient having the illness . . . cannot be performed manually within a reasonable time period." Appeal Br. 12. We first discern no time constraints within the claim language. Second, merely using a computer as a tool, to more quickly process data that could be processed mentally, does not mean a mental process so implemented is not an abstract idea. See Intellectual Ventures I LLC, 792 F.3d at 1370 ("[M]erely adding computer functionality to increase the speed or efficiency of the process does not confer patent eligibility on an otherwise abstract idea.").

In arguing that the claims are not directed to an abstract idea, the Appellants assert the "claims are directed to the requesting and receiving of the additional clinical data only if the actual value of the additional clinical data will affect the diagnosis of the patient having, or not having, a particular illness." Reply Br. 3; see also id. at 4. This accompanies the argument that the claims provide the benefit of avoiding "unnecessary patient testing," by only requesting additional "clinical data" when necessary. Id. at 2. We reject the Appellants' characterization of what the claims are directed to, because the claims recite far more than "requesting and receiving of the additional clinical data." In addition, we note the claims, broadly construed, do not necessarily have the stated benefit, because the Specification describes that "[c]linical data describes aspects of the patient's health history, family history, physique, and lifestyle, including such elements as, but not limited to, smoking, previous illnesses, and the like," and does not include test results. Spec. 3, lines 24–26; see also Spec. 7, lines 26–31 ("clinical data 310 is a collection of cases beginning with a first case 312 and proceeding to a given N cases 314 . . . . Each case contains a name or identifier 316 of a patient and a series of attributes . . . [that] include but are not limited to smoking, and exercise, or physical attributes such as but not limited to height and weight.").
Also arguing that the claims are not directed to abstract ideas, the Appellants assert the claims "include the concrete physical step of requesting and receiving additional data only when required." Reply Br. 4. We are not persuaded that asking for additional information is a concrete, physical step, because a human can do it by speaking or by writing with pen and paper, which is both part of human mental activity, and essentially data gathering, considered insignificant extra-solution activity, as we noted before.

In response to the Examiner's assertion that the claimed steps correspond to collecting, analyzing, and displaying information in "Electric Power Group," the Appellants argue that "unlike the claimed system and method of Electric Power Group, the applicants' claims go well beyond merely collecting, analyzing, and displaying results." Reply Br. 6. We are not persuaded by the Appellants' argument. First, merely collecting data and displaying results each represent insignificant extra-solution activity. Further, as in the claimed method here, the Court in Electric Power Group, LLC v. Alstom S.A., 830 F.3d 1350 (Fed. Cir. 2016), held that "we have treated analyzing information by steps people go through in their minds, or by mathematical algorithms, without more, as essentially mental processes within the abstract-idea category." Electric Power Group, 830 F.3d at 1354.

In addition, the Appellants argue, "as in Vanda, the applicants' claims specifically recite an application of the results of the analysis to determine whether the additional clinical data is to be requested and received." Reply Br. 6.[4] The apparent "application" of the analysis in claim 1 is that the system "presents the selected likelihood." This is an output step. See Bilski, 561 U.S. at 610–11. The "application" in claim 16 is not apparent, because the last recited step, "using this stratification to select a final diagnosis," is part of the analysis specified in the algorithm.

[4] "Vanda" is cited only as "Vanda Pharmaceuticals, Inc. v. West-Ward Pharmaceuticals, Inc, (13 April 2018)," without any further reference. Reply Br. 6.

Asserting that the claims also provide "something more" beyond an abstract idea, the Appellants repeat a previous argument: "the applicants' process requires the use of a computer to provide a diagnosis in a reasonable time period, and is not merely instructions to implement the idea on a computer." Appeal Br. 13. As we have explained, "the fact that the required calculations could be performed more efficiently via a computer does not materially alter the patent eligibility of the claimed subject matter." FairWarning IP, LLC v. Iatric Systems, Inc., 839 F.3d 1089, 1098 (Fed. Cir. 2016).

We are also unpersuaded by the Appellants' argument that "the generation of hypothesized values for missing data elements to determine, a priori, whether providing the actual values for the missing data elements is not a function that is well-understood, routine, or conventional." Appeal Br. 13.
The precise language in claim 16 that corresponds to this argument is "requesting, by the processor, input of an actual value of the additional clinical information about the patient based on the plurality of diagnostic hypotheses." This is merely a way of stating that the diagnostician may seek additional information if the diagnosis cannot be made based on the available information, which is inarguably a routine step taken by diagnosticians.

The precise language in claim 1 that corresponds to this argument is:

    an interface engine that: determines whether a significant difference exists among the plurality of likelihoods and: if a significant difference exists among the plurality of likelihoods: requests and receives an actual value of the additional clinical data, and identifies a select likelihood of the plurality of likelihoods corresponding to a match between the actual value and one of the plurality of different hypothesized values.

This language essentially means the same thing as in claim 16: seek additional information when a diagnosis cannot be made without it. This is unquestionably a common practice amongst medical professionals attempting to make a diagnosis, and is a step that is well-understood, routine, and conventional.

Citing "Classen, 1066," which is asserted as quoting "Research Corporation," the Appellants argue that "providing a system that only requests and receives additional clinical data when it is determined that this additional data will affect the diagnosis of a patient having a particular illness is likely to be perceived in the marketplace as a specific improvement in the technology of medical diagnostic systems." Reply Br. 4. Although we have no evidence to address this assertion, the law does not base patentable subject matter eligibility on what a "marketplace" may perceive in a product.

We are also unpersuaded by the Appellants' argument of "something more" based on "the significance of being able to selectively determine whether to incur the time and cost required to obtain additional clinical data based on an a priori assessment of the effect that this additional clinical data will have on the outcome of the diagnosis." Reply Br. 7; see also id. at 8 ("the applicants' invention avoids the time and cost that is conventionally expended to obtain missing clinical data regardless of an a priori determination of whether the missing clinical data will significantly affect the diagnosis."). Because "clinical data" may include whether a patient smokes, we fail to discern, for example, significant savings in time or cost by asking a patient whether they smoke, which is an act within the scope of the claimed selective determination. See Spec. 3, lines 24–26; see also Spec. 7, lines 26–31.

Appellants argue that the claims provide "something more" because "the determination that particular clinical data will significantly affect the diagnosis may result in conducting a test that otherwise might be omitted due to the time and/or cost associated with the test." Reply Br. 8. This, however, is the mental decision-making process that any health professional must make when deciding whether to request additional "clinical data" or order additional tests.

The Appellants have failed to show error in the Examiner's finding that claims 1–6 and 16 are directed to abstract ideas.
Because we agree that these claims recite abstract mental processes, without being "integrated into a practical application" and without reciting "something more" beyond the abstract idea that represents an "inventive concept," we sustain the rejection of claims 1–6 and 16 under 35 U.S.C. § 101.

DECISION

We reverse the rejection of claims 1–6 and 16 as indefinite under 35 U.S.C. § 112, second paragraph.

We affirm the rejection of claims 1–6 and 16 under 35 U.S.C. § 101.

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED