Ex Parte Drinkard, No. 13/298,416 (P.T.A.B. Aug. 23, 2017)

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450
www.uspto.gov

APPLICATION NO.: 13/298,416
FILING DATE: 11/17/2011
FIRST NAMED INVENTOR: John Drinkard
ATTORNEY DOCKET NO.: 1002-0003
CONFIRMATION NO.: 9256

117548 7590 08/25/2017
Murphy, Bilak & Homiller/Omron
1255 Crescent Green, Suite 200
Cary, NC 27518

EXAMINER: SAAVEDRA, EMILIO J
ART UNIT: 2127
NOTIFICATION DATE: 08/25/2017
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es): official@mbhiplaw.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte JOHN DRINKARD

Appeal 2016-002746
Application 13/298,416
Technology Center 2100

Before JEFFREY S. SMITH, SHARON FENICK, and MICHAEL M. BARRY, Administrative Patent Judges.

BARRY, Administrative Patent Judge.

DECISION ON APPEAL

Appellant1 appeals under 35 U.S.C. § 134(a) from a final rejection of claims 1–14. We have jurisdiction under 35 U.S.C. § 6(b). We affirm.

1 Appellant identifies Omron Scientific Technologies, Inc. as the real party in interest. App. Br. 2.

Introduction

Appellant's Specification states the "invention generally relates to monitoring zones, such as the area or volume around a hazardous machine, secure location, or Autonomous Guided Vehicle (AGV), and particularly relates to the use of multiple sensors for zone monitoring." Spec. ¶ 2.
There are two independent claims, 1 and 10 (respectively, an apparatus claim and a method claim), that are "directed to substantially the same subject matter." App. Br. 11. Claims 1 and 10 recite:

1. A monitoring apparatus configured to detect objects within an area or volume referred to as a monitoring zone and to detect intrusions of objects according to configured boundaries defined with respect to the monitoring zone, the monitoring apparatus comprising:

a plurality of sensors, each configured to monitor for intrusions according to a configured boundary and each sensor having a communication interface to receive configuration data defining the configured boundary to be monitored by the sensor and send intrusion detection information responsive to detecting object intrusions according to the configured boundary; and

a control unit comprising:

a configuration interface to receive control unit configuration data, the control unit configuration data defining control unit behavior with respect to each sensor, including defining a control response by the control unit with respect to the intrusion detection information received from each sensor;

a communication interface communicatively coupling the control unit to the plurality of sensors and configured to send sensor configuration data to corresponding ones among the plurality of sensors, to thereby define the configured boundary at each sensor, and to receive the intrusion detection information from each sensor;

a number of outputs configured to provide signals to external devices or systems; and

one or more processing circuits configured to control the outputs responsive to the intrusion detection information from each sensor, in accordance with the control response defined for the sensor.

10.
A method of monitoring an area or volume referred to as a monitoring zone and detecting intrusions of objects according to configured boundaries defined with respect to the monitoring zone, the method comprising:

sending sensor configuration data from a control unit to respective sensors among a plurality of sensors that are communicatively linked to the control unit, to configure a monitoring boundary used by each respective sensor for detecting object intrusions into at least a portion of the monitoring zone;

receiving control unit configuration data at the control unit, and configuring a control response of the control unit with respect to each sensor according to the control unit configuration data; and

receiving intrusion detection information from given ones of the sensors, and controlling one or more outputs of the control unit according to the control response defined for the sensors from which the intrusion detection information is received.

App. Br. 21 (Claims App'x).

The Rejection

Claims 1–14 stand rejected under 35 U.S.C. § 102(e) as anticipated by Hellickson (US 2010/0053330 A1; Mar. 4, 2010). Final Act. 2–11.

ISSUES

Based on Appellant's arguments, the five issues before us are whether the Examiner errs in the rejection of: (1) claims 1 and 10, (2) claim 6, (3) claims 7 and 12, (4) claims 8 and 13, and (5) claims 9 and 14. App. Br. 6–17.

ANALYSIS

We have reviewed the Examiner's rejections in light of Appellant's contentions of reversible error. We disagree with Appellant's conclusions. Instead, we adopt the Examiner's findings and reasons as set forth in the Final Rejection from which this appeal is taken and as set forth in the Answer. See Final Act. 2–11, Ans. 2–10. We highlight the following for emphasis.
Claims 1 and 10

Appellant argues "there is not a single sentence, drawing detail, or other item in Hellickson that discloses a control unit having a communication interface configured for distributing configured boundary information to each sensor among a plurality of sensors," as required by claim 1. App. Br. 8. We disagree.

Claim 1 recites, in relevant part, that the control unit includes "a communication interface communicatively coupling the control unit to the plurality of sensors and configured to send sensor configuration data to corresponding ones among the plurality of sensors, to thereby define the configured boundary at each sensor." App. Br. 18 (Claims App'x). We agree with the Examiner that Hellickson's disclosure maps to the recited claim requirements as follows: (1) Hellickson's "central command and control facility" maps to the recited "control unit" (see ¶ 29); (2) Hellickson's "security sensors" map to the recited "plurality of sensors" (see ¶¶ 18, 29, 34); and (3) Hellickson's sending of control data to camera sensors, e.g., to pan or zoom the camera, maps to the control unit being "configured to send sensor configuration data to corresponding ones among the plurality of sensors, to thereby define the configured boundary at each sensor" (see ¶¶ 21, 27, 67–68, Fig. 9 (an ordinarily skilled artisan would have understood the sensor fusion and situational awareness functions in Hellickson's Figure 9 to be part of the command and control facility)).

Appellant argues "Hellickson does not teach that there is any control unit that is communicatively linked to all such sensors and configured to send each sensor's protected boundary definition information. Nor is it inherent that such an arrangement be present in Hellickson." App. Br. 8.
Appellant's reference to a "protected" boundary is not commensurate with claim 1, which, in relevant part, requires that each sensor have an "interface to receive configuration data defining the configured boundary to be monitored by the sensor." Hellickson's command and control facility ("control unit") clearly is linked to and communicates with its security sensors. Thus, a skilled artisan would have understood the control facility and the sensors in Hellickson each necessarily include the "communication interfaces," as recited in claim 1.

We agree with the Examiner's finding that command data sent to a camera sensor to adjust its field of view constitutes "configuration data defining the configured boundary." See Final Act. 3 (citing Hellickson ¶¶ 19, 27, 67 for this limitation). We also agree with the Examiner's explanation that, under a broad but reasonable interpretation, in view of Appellant's Specification:

[t]he field of view (i.e., "window") of each sensor is ... a "boundary" since a view that is limited to a portion of a room would inherently have specific viewing boundaries. Additionally, if the sensor's viewing angle or "window" can be adjusted, then the field of view is configurable (i.e., "a configured boundary"). The detection is with respect to such configured boundary because an intrusion will be detected when it [is] within said boundary. Furthermore, since a camera provides a view of its field of view to a monitoring center, then this view is data of the configuration boundary.

Ans. 3. Thus, Appellant does not persuade us the Examiner errs in finding Hellickson discloses a control unit having a communication interface configured for sending configuration data to thereby define the configured boundary at each sensor, as recited in claim 1.
Appellant also does not persuade us the Examiner errs in finding Hellickson discloses the corresponding method step requirements in claim 10, which we note does not recite "a communication interface."

Appellant further argues Hellickson "does not teach or suggest a control unit that includes a configuration interface configured to receive sensor configuration data defining specific control behavior to be exercised with respect to different ones of Hellickson's sensors." App. Br. 10. Appellant contends that "to the extent that Hellickson discusses responding to detecting signals from its sensors, it generically describes alarms, camera controls, etc., which appear to be common to all sensors and not defined with respect to any particular sensor." Id. This is unpersuasive.

The relevant requirement in claim 1 is for the control unit to have an interface to receive "configuration data defining control unit behavior with respect to each sensor, including defining a control response by the control unit with respect to the intrusion detection information received from each sensor." App. Br. 18 (Claims App'x). Appellant's Specification does not limit the meaning of "control unit configuration data." Because Hellickson uses command data ("configuration data") to zoom or pan its camera sensors ("defining control unit behavior with respect to each sensor") based on detecting intruders from the sensor data ("with respect to the intrusion detection information received from each sensor"), Appellant does not persuade us the Examiner errs in finding Hellickson discloses this limitation. See Hellickson ¶¶ 20–33; see also Final Act. 3 and Ans. 8–10.
Appellant also does not persuade us the Examiner errs in finding Hellickson discloses the corresponding method step requirements in claim 10, which we note does not recite "a configuration interface."

Accordingly, we sustain the Examiner's § 102 rejection of claims 1 and 10, and also of claims 2–5 and 11, for which Appellant offers no substantive arguments separate from those for claims 1 and 10.

Claim 6

Claim 6 depends from claim 1 and recites "wherein the control unit is configured to provide a user configuration interface and to set or adjust the configured boundaries of the sensors based at least in part on user inputs received via the user configuration interface." App. Br. 20 (Claims App'x). Appellant argues Hellickson is silent as to a "user configuration interface," as recited, and that Hellickson does not disclose the added requirements of claim 6 based on claim 1's requirement for the control unit to send boundary configuration data to the sensors. App. Br. 11–12. As the Examiner finds, however, and we agree, Hellickson's operator interface for controlling the field of view of camera sensors discloses this requirement. Final Act. 6 (citing Hellickson ¶¶ 21, 27, 56); see also Ans. 7 (additionally citing Hellickson ¶¶ 40–41). In other words, Hellickson's teaching of allowing a user to adjust a camera's field of view (e.g., ¶ 27) discloses adjusting the configured boundary for that sensor.

Accordingly, we sustain the Examiner's § 102 rejection of claim 6.
Claims 7 and 12

Claim 7 depends from claim 6 and recites that the control unit is further configured to:

receive measurement data corresponding to field of view data from a first one of the sensors; receive or generate data representing a displayed boundary representing the configured boundary of a second one of the sensors as seen from the perspective of the first sensor; provide the field of view data and the data representing the displayed boundary via the user configuration interface; adjust the data representing the displayed boundary responsive to user inputs; and adjust the configured boundary of the second sensor in accordance with the adjustments made to the displayed boundary.

App. Br. 20 (Claims App'x). Appellant argues the Examiner errs in finding Hellickson discloses these requirements because Hellickson discloses "display of actual detected target data or 'situational awareness' data associated with the detection of a target," which "does not describe anything remotely like the claimed displaying and adjusting of sensor boundary information." App. Br. 13. We disagree.

As the Examiner finds, and we agree, Hellickson discloses claim 7's recited requirement to "adjust the data representing the displayed boundary responsive to user inputs" by disclosing configuring its cameras to monitor "areas outside of originally marked monitoring area." Final Act. 7 (citing Hellickson ¶¶ 21, 27, 56, 67–68) (emphasis omitted) (as discussed above, the bounds of a displayed field of view from a camera correspond to a recited displayed boundary). Also, by the plain meaning of the requirement to "adjust the configured boundary of the second sensor in accordance with the adjustments made to the displayed boundary," Hellickson's cameras' fields of view are adjusted in response to user input to slew, pan, tilt, or zoom the cameras. See, e.g., Hellickson ¶¶ 21, 27.
Regarding claim 7's requirement to "receive measurement data," in discussing the breadth and interpretation of the claim terms, the Examiner finds paragraph 21 of Hellickson discloses the recited requirement by disclosing "provid[ing] for measurements to be obtained by a sensor device of a target object in the field of view of the sensor." Ans. 5–6. Appellant does not dispute this finding. See Reply Br. 5. Cumulative to the Examiner's findings, we highlight that the received measurement data does not affect any subsequent method steps of the recited claim.

Regarding the requirement to "receive or generate data representing a displayed boundary representing the configured boundary of a second one of the sensors as seen from the perspective of the first sensor," as recited, the Examiner finds that two sensors with overlapping fields of view disclose this requirement. Final Act. 6 (citing Hellickson ¶¶ 41–42); Ans. 7–8 (citing, inter alia, Hellickson Fig. 8). Appellant does not persuade us the Examiner errs in this finding. See App. Br. 12–13; Reply Br. 5. Appellant does not explain how or why, given that a camera's field of view discloses a boundary, this requirement of claim 7 does not encompass Hellickson's receiving and generating of field of view data from one camera's view of another camera's view. In particular, Appellant does not persuasively distinguish this claim limitation from Figure 8 of Hellickson, which shows the view A2 from sensor A has a portion of B1 of sensor B's field of view, as seen from the perspective of sensor A. See Ans. 7–8.

Accordingly, we sustain the Examiner's § 102 rejection of claim 7, and also of claim 12, which depends from claim 10 through claim 11, and which Appellant argues together with claim 7.
Claims 8 and 13

Claim 8 depends from claim 1 and recites "wherein the outputs include one or more safety-critical outputs, and wherein the one or more processing circuits control the safety critical outputs responsive to the intrusion detection information from a particular sensor, in accordance with the control response defined for that particular sensor." App. Br. 20 (Claims App'x). The Examiner finds Hellickson's "[o]utputs to alarms and armament equipment" constitute the safety critical outputs, and that Hellickson discloses controlling these according to the language of claim 8. Final Act. 7 (citing Hellickson ¶¶ 19, 21, 30, 77) (emphasis omitted). Appellant contends:

Hellickson does not discuss safety-critical outputs or suggest having a number of control outputs among which one or more are considered as being safety-critical. Nor does Hellickson teach or suggest any type of control unit that receives configuration data that defines the control unit behavior with respect to each sensor, including defining the particular control response to be undertaken by the control unit in response to receiving intrusion detection from a particular sensor. At most, Hellickson broadly discusses "a control signal" such as may be used to control one or more cameras, or an "overall alarm" which by its labeling would seem to be a global or general output irrespective of the particular sensors involved.

App. Br. 14. This is unpersuasive.
An ordinarily skilled artisan would have understood Hellickson's alarm to be a "safety-critical output." Also, by reciting "one or more safety-critical outputs," claim 8 does not require more than one of claim 1's "number of outputs configured to provide signals to external devices or systems" be "safety-critical." The ordinarily skilled artisan would have understood Hellickson's command and control center includes "processing circuits" for its disclosed functionality, such as to provide "different alarm levels based on the exact location and direction of travel of a target" (¶ 21). We agree with the Examiner that such tracking of targets (or "intruding targets" (¶ 28)) for alarm purposes by any given sensor constitutes "control[ling] the safety critical outputs responsive to the intrusion detection information from a particular sensor, in accordance with the control response defined for that particular sensor," as recited by claim 8.

Accordingly, we sustain the Examiner's § 102 rejection of claim 8, and also of claim 13, which Appellant argues together with claim 8.

Claims 9 and 14

Claim 9 depends from claim 1 and requires the control unit be "configured to identify the sensor from which intrusion detection information is received, based on identifiers uniquely identifying each sensor among the plurality of sensors, and is further configured to look up or otherwise select the control response based on the identifier associated with the intrusion detection information received at any given time by the control unit." App. Br. 21 (Claims App'x).
The Examiner finds Hellickson's sensors that include GPS location signals disclose the recited "identifiers uniquely identifying each sensor," and that target tracking and identifying "friend or foe" using this sensor information discloses the recited "control response based on the identifier associated with the intrusion detection information received at any given time," as recited. Final Act. 7–8 (citing Hellickson ¶¶ 21, 27, 29–30, 34, 39, 43). Appellant argues the Examiner errs because Hellickson

only broadly refers to the generation of camera control signals and suggests, such as in paragraph [0021] that geo-located data (i.e., for a detected intrusion) can be used to slew, pan, tilt, or zoom cameras for a human operator. It would seem, then, that this example suggests "camera control" as a generalized or common response for all incoming sensor data.

App. Br. 15.

Appellant reads Hellickson too narrowly. Hellickson's paragraph 21 discusses using sensor data output for a variety of purposes, such as target identification and generating alarms, and does not limit this to doing so for a human operator or for only generalized responses. Hellickson makes clear that software can be used to automate the features of its invention (Hellickson ¶ 77), such as "for moving the one or more video cameras in a direction of the one or more potential intruders" (Id. at claim 3). The ordinarily skilled artisan would have understood that identifying the camera and determining a "control response" for that camera is part of the software control for zooming, panning, or slewing a camera to track a target. Thus, we agree with the Examiner that Hellickson discloses the limitations of "look[ing] up or otherwise select[ing] the control response based on the identifier associated with the intrusion detection information," as recited by claim 9.
Accordingly, we sustain the Examiner's § 102 rejection of claim 9, and also of claim 14, which Appellant argues together with claim 9.

DECISION

For the above reasons, we affirm the rejection of claims 1–14 under 35 U.S.C. § 102(e).

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED