There is provided a scanning microscope that samples light from a sample irradiated with pulsed light and converts the light into data for each pixel, in which a sampling time per pixel is substantially an integer multiple of a pulse period of the pulsed light, and a sampling start timing of each pixel is synchronized with a pixel synchronization signal.
H04N 25/40 - Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
H04N 25/76 - Addressed sensors, e.g. MOS or CMOS sensors
H04N 25/78 - Readout circuits for addressed sensors, e.g. output amplifiers or A/D converters
2.
SCANNING MICROSCOPE, PIXEL GENERATION METHOD, AND STORAGE MEDIUM
A scanning microscope includes: a photodetector that detects light from a sample; a sampling circuit that samples an output signal of the photodetector; and a processor. The processor performs an operation including at least integration on a plurality of items of sampling data sampled by the sampling circuit, determines whether a result of the operation is noise or a signal, and generates pixel data based on a result of the operation and a result of the determination.
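A minimal sketch of the integrate-then-classify idea described above, assuming a simple fixed noise threshold; the function name, the threshold value, and the zero-clamping of noise-classified pixels are illustrative choices, not the patented criterion:

```python
import numpy as np

def generate_pixel(samples, noise_threshold=5.0):
    """Integrate the sampled values for one pixel and decide whether the
    result is noise or a signal (hypothetical threshold-based criterion)."""
    integrated = float(np.sum(samples))          # operation including at least integration
    is_signal = integrated >= noise_threshold    # noise/signal determination
    # Pixel data is generated from both the operation result and the decision;
    # noise-classified pixels are clamped to zero in this sketch.
    return integrated if is_signal else 0.0

# Example: eight sampling-circuit outputs for one pixel
print(generate_pixel([0.2, 0.1, 0.3, 0.2, 0.1, 0.2, 0.3, 0.1]))   # below threshold -> 0.0
print(generate_pixel([1.2, 1.1, 1.3, 1.0, 0.9, 1.2, 1.1, 1.0]))   # above threshold -> integrated value
```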
An image processing system includes a scanning unit, a pixelated photon detector (PPD), a memory in which a plurality of tables in which an intensity gradation value and an output gradation value are associated with each other are stored, and a processor. The processor generates first image data in which the intensity gradation value indicated by a light intensity signal output from the PPD is set as a pixel value of a constituent pixel based on the light intensity signal and a scanning position of the scanning unit, generates second image data in which the output gradation value is set as a pixel value of a constituent pixel by converting the first image data based on one table selected from the plurality of tables stored in the memory, and causes a display unit to display an image represented by the second image data.
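A small illustration of the table-based conversion from first image data to second image data; the `tables` dictionary, the gamma curve, and the 8-bit range are assumed placeholders for the tables stored in the memory:

```python
import numpy as np

# Hypothetical 8-bit tables: each maps an intensity gradation value (0-255)
# to an output gradation value.
tables = {
    "linear": np.arange(256, dtype=np.uint8),
    "gamma":  (255 * (np.arange(256) / 255.0) ** (1 / 2.2)).astype(np.uint8),
}

def to_second_image(first_image, table_name):
    """Convert first image data (intensity gradation values) into second image
    data (output gradation values) with one table selected from the set."""
    lut = tables[table_name]
    return lut[first_image]                        # per-pixel table lookup

first = np.array([[0, 64], [128, 255]], dtype=np.uint8)   # PPD intensity gradations
second = to_second_image(first, "gamma")                  # data sent to the display unit
print(second)
```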
A processor of a three-dimensional data generation system is configured to determine whether two or more images include a graphics region in which graphics information is superimposed. The processor is configured to execute first processing when the two or more images do not include the graphics region. The first processing includes generation processing of generating the three-dimensional data by using the two or more images. The processor is configured to execute second processing when at least one image of the two or more images includes the graphics region. The second processing includes processing of preventing the graphics region in the at least one image from contributing to generation of the three-dimensional data.
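A rough sketch of the branch between the first and second processing, assuming the graphics region is available as a boolean mask per image; invalidating masked pixels with NaN is just one illustrative way to keep them from contributing to reconstruction:

```python
import numpy as np

def prepare_for_reconstruction(images, graphics_masks):
    """First processing: if no image contains a graphics region, use the images
    as they are. Second processing: otherwise invalidate the graphics pixels
    (NaN) so they cannot contribute to the three-dimensional data."""
    if not any(mask.any() for mask in graphics_masks):
        return list(images)                          # first processing
    masked_images = []
    for image, mask in zip(images, graphics_masks):
        masked = image.astype(float)
        masked[mask] = np.nan                        # excluded from matching/triangulation
        masked_images.append(masked)
    return masked_images                             # second processing

images = [np.random.rand(4, 4) for _ in range(2)]
masks = [np.zeros((4, 4), dtype=bool) for _ in range(2)]
masks[0][0, :] = True                                # e.g. a text banner in the first image
print(prepare_for_reconstruction(images, masks)[0])
```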
G06T 5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
5.
MICROSCOPE SYSTEM, METHOD OF IMAGE PROCESSING, AND STORAGE MEDIUM
A microscope system includes: a microscope that forms an optical observation image; an image capturing device that captures the optical observation image and generates an observation image; a processor that generates a histogram plotting pixel values of every color component in the observation image, determines a black balance adjustment value based on the histogram plotting pixel values of every color component, and subtracts the black balance adjustment value from pixel values of every color component in each pixel of the observation image to generate an adjusted observation image; and an output device that outputs the adjusted observation image.
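An illustrative per-channel black balance adjustment along these lines, assuming the adjustment value is read from the low end of each color component's histogram; the percentile and the 8-bit clipping are assumptions, not the patent's specific rule:

```python
import numpy as np

def black_balance(observation_image, percentile=0.5):
    """Subtract a per-channel black balance adjustment value taken from the low
    end of each color component's histogram, then clip back to the 8-bit range."""
    adjusted = observation_image.astype(np.int32)
    for c in range(observation_image.shape[2]):                  # one histogram per color component
        black_level = int(np.percentile(observation_image[..., c], percentile))
        adjusted[..., c] -= black_level                          # black balance adjustment value
    return np.clip(adjusted, 0, 255).astype(np.uint8)

img = np.random.randint(10, 200, size=(32, 32, 3), dtype=np.uint8)
print(black_balance(img).min(axis=(0, 1)))                       # channel minima move toward zero
```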
This microscope system comprises: a microscope optical system that includes an eyepiece lens and forms an intermediate image of an observation object on an object side of the eyepiece lens; a sensing device (500) that is different from a microscope camera; a control unit that outputs an auxiliary image including work assistance information which is determined on the basis of a work condition with respect to the observation object, said work condition being specified from work information acquired by the sensing device (500), and reference information corresponding to the work condition, said reference information being acquired from a storage unit; and a superimposition device that superimposes the auxiliary image outputted by the control unit on the image plane on which the intermediate image is formed.
A microscope system includes a microscope, a digital camera configured to image an object through the microscope, and a processor. The processor is configured to perform scene recognition based on an image of the object obtained by the digital camera, using a machine learning model that has learned a plurality of scenes, to perform scene determination based on a result of the scene recognition, to temporally stabilize a result of the scene determination, and to change settings of the digital camera based on the result of the scene determination.
A work improvement support system (1) comprises: a work data generation unit (21) that generates work data by attaching, to a microscope image acquired by a microscope device (10) during the work of assembling or inspecting equipment using the microscope device (10), tag information relating to the work at the time of the acquisition of the microscope image; a quality information generation unit (22) that, on the basis of a plurality of sets of work data generated by the work data generation unit (21), generates quality information relating to the quality of the work for each work data classification classified by the tag information; and a display control unit (23) that displays the quality information generated by the quality information generation unit (22) on a display device (30).
An observation apparatus includes a display apparatus that displays a display pattern, a display projection optical system that projects a light beam from the display apparatus and forms an image of the display pattern, a combining optical element that combines a light beam from a sample and the light beam from the display apparatus, and an eyepiece optical system through which an image of the sample and an image of the display pattern are simultaneously observable by an observer. A numerical aperture (NA) of the light beam from the display apparatus is smaller than a maximum value and larger than a minimum value of an NA of the light beam from the sample, at a position of an image on an optical path that is formed after the light beams from the sample and the display apparatus are combined by the combining optical element.
The light measurement device includes: a photodetector that detects pulsed signal light and outputs a detection signal including an exponential response; an A/D conversion circuit that converts the detection signal into a digital signal; and a processor that performs setting of a transformation matrix for inversely converting the digital signal, and inverse conversion of the digital signal by the set transformation matrix to calculate an estimated pulse of the signal light.
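A sketch of the inverse-conversion idea, assuming a single-exponential detector response with a known decay constant; the convolution-matrix construction and the plain least-squares solve stand in for the patent's transformation-matrix setup:

```python
import numpy as np

def estimate_pulses(digital_signal, tau, dt=1.0):
    """Invert the detector's exponential response exp(-t/tau): build its
    convolution (transformation) matrix H and solve H @ x = signal in the
    least-squares sense to obtain the estimated pulses."""
    n = len(digital_signal)
    h = np.exp(-np.arange(n) * dt / tau)                 # sampled exponential response
    H = np.array([[h[i - j] if i >= j else 0.0 for j in range(n)] for i in range(n)])
    estimate, *_ = np.linalg.lstsq(H, digital_signal, rcond=None)
    return estimate

# Synthetic test: three pulses blurred by the exponential response are recovered.
true_pulses = np.zeros(50)
true_pulses[[5, 20, 21]] = [1.0, 0.5, 0.8]
signal = np.convolve(true_pulses, np.exp(-np.arange(50) / 4.0))[:50]
print(np.round(estimate_pulses(signal, tau=4.0), 2)[:25])
```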
A three-dimensional data generation method includes the following processing. A processor acquires two or more images of a component inside a turbine. The processor detects two or more correspondence regions that are the same regions in at least two images. The processor determines whether at least part of a region of each image is a change region or a non-change region. The processor generates three-dimensional data by using a correspondence region determined to be the change region without using a correspondence region determined to be the non-change region.
09 - Scientific and electric apparatus and instruments
Goods & Services
Software applications for industrial endoscopes; computer software; endoscopic equipment for industrial purposes.
14.
REFRACTIVE INDEX DISTRIBUTION GENERATION DEVICE, REFRACTIVE INDEX DISTRIBUTION GENERATION METHOD, REFRACTIVE INDEX DISTRIBUTION GENERATION SYSTEM, AND RECORDING MEDIUM
A refractive index distribution generation device includes a processor and a memory. The processor performs a refractive index distribution generation process of generating a refractive index distribution corresponding to a processing-target image. The refractive index distribution generation process includes an input process of inputting, from the memory, the processing-target image, first refractive index information indicating a refractive index of a first structure, and second refractive index information indicating a refractive index of a second structure different from the first structure, and a setting process of setting respective refractive indexes that constitute a refractive index distribution. The setting process includes a first setting process of setting a refractive index based on the first refractive index information at a position corresponding to a first image region of the processing-target image on the basis of a signal intensity, and a second setting process of setting a refractive index based on the second refractive index information at a position corresponding to an image region different from the first image region of the processing-target image. The first image region is an image region corresponding to the first structure.
G01N 21/27 - Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection
15.
OBSERVATION SYSTEM, FOCUS POSITION CALCULATION METHOD, AND COMPUTER READABLE MEDIUM
An observation system includes: an illumination device that illuminates an observation object with illumination light from a plurality of different directions; an imaging device that includes an objective that condenses light from the observation object and images the observation object with the light collected by the objective; and a control unit. The control unit calculates a focus position of the imaging device based on a plurality of image groups acquired by the imaging device at observation positions different from each other in an optical axis direction of the objective, each of the plurality of image groups including a plurality of images of the observation object illuminated from directions different from each other by the illumination device.
A sample image generation device includes a memory and a processor. A first image is an image obtained by capturing an image of a sample. A predetermined direction in the first image is a direction in which a virtual observation optical system is present among optical axis directions of the virtual observation optical system. The processor acquires the first image from the memory, divides the first image into a plurality of areas, acquires a refractive index distribution of the sample from the memory, calculates a point spread function using the refractive index distribution, generates second images using the point spread function, combines the second images, and generates a third image corresponding to the first image. In the calculation process, the point spread function of a first area is calculated using the refractive index distribution of each area included in an area group. The first area is an area for which the point spread function is to be calculated. The area group is constituted of a plurality of areas inside a range in which light rays originating from the first area radiate in the predetermined direction.
An objective includes a positive first lens group, a negative second lens group, and a positive third lens group including two or more lens components, in which the first, second, and third lens groups are sequentially disposed from an object side. The first lens group includes a meniscus-shaped first lens having positive refractive power, in which the first lens has a concave surface facing an image side, and a meniscus-shaped second lens having positive refractive power, in which the second lens has a concave surface facing the image side. The second lens group includes two or more positive lenses and one or more negative lenses. The total number of lenses included in the second lens group is seven or more. The objective satisfies the following conditional expressions.
1.6≤fL/TTL≤5 (1)
2≤ER1F/ER2F (2)
1.29≤ER3F/ER2R (3)
G02B 9/14 - Optical objectives characterised both by the number of the components and their arrangements according to their sign, i.e. + or – having three components only arranged + – +
This inspection assistance system has an image sensor and a control unit. The control unit acquires two or more first images from the image sensor in response to the rotation of a rotating body. The control unit adds observation information indicating that observation is necessary to at least one first image among the two or more first images. The control unit outputs a control signal to a turning tool in order for the insertion part to capture, in the field of view, an object appearing in the at least one first image. After the turning tool rotates the rotating body, the control unit acquires at least one second image from the image sensor.
09 - Scientific and electric apparatus and instruments
Goods & Services
downloadable software applications for displaying images and videos of objects acquired by industrial endoscopes and generating and measuring 3D data; downloadable computer software for displaying images and videos of objects acquired by industrial endoscopes and generating and measuring 3D data
25.
OBSERVATION DEVICE, OBSERVATION SYSTEM, AND METHOD FOR CONTROLLING OBSERVATION DEVICE
An observation device includes: an illumination system that is disposed on a downward side of a sample and emits illumination light from the downward side toward an upward side of the sample; an imaging system that is disposed on the downward side and images the sample with transmitted light that is reflected on the upward side in the illumination light emitted from the illumination system and is transmitted through the sample from the upward side to the downward side; and a control unit that executes imaging control using the illumination system and the imaging system. The control unit controls an exposure amount in correspondence with required time for the imaging control which includes required operation time and required pause time determined based on imaging conditions.
A microscope image of a cultured cell cluster derived from a cancer specimen of a patient is acquired. A measured value of a gene expression level of the cluster is acquired. Based on the image, a morphological representation identifiably expressing, by a vector quantity of a plurality of dimensions, a morphological difference between a group of cell clusters cultured from the same cancer specimen and a group of cell clusters cultured from another cancer specimen is acquired. The acquired morphological representation is input to a function, which is obtained by fitting the measured value with respect to the morphological representation, to acquire a prediction value of the gene expression level. Prediction accuracy is estimated based on the prediction value and the measured value. Based on the estimated prediction accuracy, a gene related to a morphological change of the cell cluster is extracted as a gene candidate.
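A toy illustration of the fit-predict-evaluate loop described above, with made-up morphological vectors and expression levels; the linear fit and the correlation-based accuracy are assumptions, not the patented model:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: rows = cell clusters, columns = morphological representation
# (multi-dimensional vector); y = measured expression level of one gene.
X = rng.normal(size=(30, 4))
y = X @ np.array([0.8, 0.0, 0.3, 0.0]) + rng.normal(scale=0.1, size=30)

# Function obtained by fitting the measured values with respect to the
# morphological representation (a linear fit is assumed here).
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
prediction = X @ coef                       # prediction value of the gene expression level

# Prediction accuracy estimated from the prediction value and the measured value
# (correlation assumed); a high value would flag this gene as a candidate.
accuracy = np.corrcoef(prediction, y)[0, 1]
print(round(float(accuracy), 3))
```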
G16H 20/10 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to drugs or medications, e.g. for ensuring correct administration to patients
G16H 50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
An endoscope system according to the present invention has: an endoscope device having a bendable insertion part; an operation device; and a control unit. A user interface of the operation device generates a first signal according to a state of the user interface that changes when contact is made with an object. A motion sensor of the operation device generates a second signal according to a physical motion of the operation device. The control unit calculates, on the basis of the first signal, a first control value to be used for first control. The control unit calculates, on the basis of the second signal, a second control value to be used for second control. At least one of the first control and the second control is performed to bend the insertion part.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
G02B 23/24 - Instruments for viewing the inside of hollow bodies, e.g. fibrescopes
29.
IMAGE DISPLAY METHOD, DISPLAY CONTROL DEVICE, AND RECORDING MEDIUM
A processor causes a storage medium to store three-dimensional data of a subject in a storage step. The processor selects a reference image in a first selection step. The processor selects a selected image that is a two-dimensional image used for generating the three-dimensional data on the basis of the reference image in a second selection step. The processor estimates a second camera coordinate regarding the reference image on the basis of a first camera coordinate regarding the selected image in an estimation step. The processor displays an image of the subject on a display in a display step. The image of the subject visualizes at least one of the second camera coordinate and a set of three-dimensional coordinates of one or more points of the subject calculated on the basis of the second camera coordinate.
F02C 7/00 - Features, component parts, details or accessories, not provided for in, or of interest apart from, groups F02C 1/00 - F02C 6/00; Air intakes for jet-propulsion plants
G06T 7/70 - Determining position or orientation of objects or cameras
30.
ENDOSCOPE DEVICE, METHOD OF OPERATING ENDOSCOPE DEVICE, AND RECORDING MEDIUM
An endoscope device includes an imaging device, a determination unit, and a control unit. The determination unit is configured to determine the position and the size of a region including the distal end of a treatment tool seen in an image. The control unit is configured to set a control region in the image based on the position and the size. The control unit is configured to execute exposure control of controlling brightness of the image by using a value of a pixel in the control region. The control unit is configured to display the control region on a display. The control region includes a first region and excludes a second region. The distal end is seen in the first region. A base end of the treatment tool is seen in the second region. The second region is in contact with a boundary of the image.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time
H04N 23/74 - Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
A microscope apparatus includes a color temperature adjustment unit. The color temperature adjustment unit includes, in order from an incident side of light: a first polarizer that converts the light into linearly polarized light; a ¼-wave plate that converts at least light of a predetermined wavelength of the linearly polarized light into circularly polarized light, the ¼-wave plate being fixed in a predetermined orientation relative to the first polarizer; and a second polarizer that extracts a predetermined polarization component, the second polarizer being disposed to be rotatable relative to the first polarizer. A difference between a phase amount with respect to light of 435 nm and a phase amount with respect to light of 635 nm of the ¼-wave plate is 30 degrees or more.
An objective includes a positive first lens group, a negative second lens group having one or more aspheric surfaces, a third lens group, and a positive fourth lens group consisting of two lens components, wherein
1.6≤fL/TTL≤5 (1)
1.5≤ER1/ER2 (2)
D2/OTTL≤0.6 (3) are satisfied.
Herein, fL is a focal length. TTL is the distance from the surface closest to the object side to the surface closest to the image side. ER1 is an effective radius of the surface closest to the object side. ER2 is an effective radius of the surface of the second lens group closest to the object side. D2 is the distance from the object plane to the surface of the second lens group closest to the image side. OTTL is the distance from the object plane to the surface closest to the image side.
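For concreteness, a tiny checker for conditional expressions (1) to (3) using the quantities defined above; the numeric values in the example call are invented solely to exercise the check:

```python
def satisfies_conditions(fL, TTL, ER1, ER2, D2, OTTL):
    """Evaluate conditional expressions (1)-(3) with the quantities defined above."""
    c1 = 1.6 <= fL / TTL <= 5       # (1)
    c2 = 1.5 <= ER1 / ER2           # (2)
    c3 = D2 / OTTL <= 0.6           # (3)
    return c1 and c2 and c3

# Invented values, only to exercise the check:
print(satisfies_conditions(fL=90.0, TTL=45.0, ER1=6.0, ER2=3.0, D2=20.0, OTTL=50.0))  # True
```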
G02B 9/64 - Optical objectives characterised both by the number of the components and their arrangements according to their sign, i.e. + or – having more than six components
G02B 13/02 - Telephoto objectives, i.e. systems of the type + – in which the distance from the front vertex to the image plane is less than the equivalent focal length
G02B 13/18 - Optical objectives specially designed for the purposes specified below with lenses having one or more non-spherical faces, e.g. for reducing geometrical aberration
A sample observation apparatus includes a light source unit, an illumination optical system, an observation optical system, a light detection element, a scanning unit, a holding member, and an image processing device. A light spot is formed by the illumination optical system. The scanning unit moves the light spot and the holding member relative to each other. A pupil of the observation optical system and the light detection element are positioned at a position conjugate to a pupil position of the illumination optical system. The image processing device generates an image of a sample based on a predetermined image and a filter, and the predetermined image is an image based on a signal output from the light detection element. The filter includes a first region and a second region, and a value in the first region is greater than a value in the second region.
An experiment support apparatus includes: one or more non-transitory computer-readable media that include an instruction; and one or more processors that execute the instruction. The instruction is configured to cause the one or more processors to execute an operation including: causing a display device to display a condition table TB that indicates an experiment condition for measurement, in response to an input of the experiment condition; and causing the display device to display at least one of a measurement result based on measurement data, or an analysis result of the measurement data, in a cell of the condition table TB, in response to an input of the measurement data, the measurement data being obtained using a measurement apparatus under the experiment condition corresponding to the cell.
This information processing device comprises a storage unit and a control unit. The storage unit stores a plurality of pieces of image data, each having a family ID that is not updated when an image is processed and that is uniquely assigned only when a test device acquires an image. In accordance with a request from a terminal having a display device, the control unit outputs, to the terminal, display information for displaying, as a list on a screen, one or more pieces of image data (1093, 1093a, 1093b, 1094, 1094a) included in a data set (A32), the one or more pieces of image data being from the plurality of pieces of image data. The display information includes group display information (1098, 1099) that groups, by family ID, the one or more pieces of image data included in the data set (A32).
In a three-dimensional image display method, a processor acquires first three-dimensional data and second three-dimensional data of a subject from a recording medium. The processor converts a first three-dimensional coordinate system of the first three-dimensional data and a second three-dimensional coordinate system of the second three-dimensional data into a three-dimensional common coordinate system on the basis of structure information related to a geometric structure of the subject. The processor displays an image of the first three-dimensional data in the common coordinate system and an image of the second three-dimensional data in the common coordinate system on a display.
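A minimal sketch of mapping one data set's points into the common coordinate system, assuming the structure information has already been reduced to a rigid transform (R, t); deriving that transform from the subject's geometric structure is outside the sketch:

```python
import numpy as np

def to_common_frame(points, R, t):
    """Map 3D points of one data set into the common coordinate system with a
    rigid transform (rotation R, translation t)."""
    return np.asarray(points) @ R.T + t

# Placeholder transform standing in for the one derived from the subject's structure:
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])          # 90 degrees about the z axis
t = np.array([10.0, 0.0, 0.0])
first_data_points = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
print(to_common_frame(first_data_points, R, t))
```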
H04N 13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
H04N 13/167 - Synchronising or controlling image signals
H04N 13/207 - Image signal generators using stereoscopic image cameras using a single 2D image sensor
H04N 13/361 - Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
37.
MICROSCOPE SYSTEM, SUPERIMPOSITION UNIT, SUPERIMPOSITION DISPLAY METHOD, AND PROGRAM
A microscope system (1) comprises: an eyepiece lens (30); an observation optical system (100); a reading device (50); a superimposition device (25); and a control device (70). The observation optical system (100) forms a specimen image, using observation light from a specimen (S), on an image surface on the object side of the eyepiece lens (30). The reading device (50) reads identification information attached to the specimen (S). The superimposition device (25) superimposes, onto the image surface, specimen information obtained on the basis of the identification information. By controlling the superimposition device, the control device (70) switches the display position of the specimen information on the image surface when prescribed conditions are met.
A three-dimensional reconstruction device includes a processor. The processor is configured to acquire position-and-orientation information and two-dimensional coordinate information. The processor is configured to acquire a three-dimensional size of a subject. The processor is configured to calculate a correction coefficient used for matching the size at the position of a camera to a known size. The processor is configured to correct the position of the camera by using the correction coefficient. The processor is configured to restore a three-dimensional shape of the subject by using the corrected position, the orientation of the camera at the position, and the two-dimensional coordinate information.
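A small example of the scale correction described above; here the correction coefficient is simply the ratio of the known size to the size measured at the camera positions, and uniform scaling of the positions is an assumption:

```python
import numpy as np

def correct_camera_positions(positions, measured_size, known_size):
    """Scale camera positions by the correction coefficient that matches the
    size measured at those positions to the known three-dimensional size."""
    coefficient = known_size / measured_size        # correction coefficient
    return np.asarray(positions) * coefficient

positions = [[0.0, 0.0, 1.0], [0.1, 0.0, 1.0]]
print(correct_camera_positions(positions, measured_size=2.5, known_size=5.0))
```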
An observation system includes a mounting table on which a sample container is placed, a surface light source that is disposed in one of two regions divided by the mounting table and has a light emitting plane, an observation optical system disposed in the other thereof, a conveyance mechanism moving the observation optical system in a direction orthogonal to an optical axis of the observation optical system to change an observation position, and a controller controlling a light emission pattern defined by a light emitting region where light is emitted on the light emitting plane. The controller executes first light emission pattern control in which the light emission pattern is changed according to the observation position, or, alternatively, second light emission pattern control in which the light emission pattern is switched between a plurality of periodic light emission patterns having phases different from each other.
An estimation system has a memory and a processor. The memory stores first wave front information and second wave front information. The first wave front information is information on the wave front that is obtained based on first illumination light that has passed through an object. The second wave front information is information on the wave front that is obtained based on second illumination light that has passed through the object. The wavelength at which the intensity of the second illumination light is highest is shorter than the wavelength at which the intensity of the first illumination light is highest. The processor performs an estimation process of estimating a three-dimensional optical property of the object. The three-dimensional optical property is a refractive index distribution or an absorbance distribution. The estimation process uses both the first wave front information and the second wave front information.
G01N 21/45 - Refractivity; Phase-affecting properties, e.g. optical path length using interferometric methods; Refractivity; Phase-affecting properties, e.g. optical path length using Schlieren methods
A data processing method includes an input step S1 of inputting measurement data into a neural network, an estimation step S2 of generating estimation data from the measurement data, a restoration step S3 of generating restoration data from the estimation data, and a calculation step S4 of calculating a confidence level of the estimation data, based on the measurement data and the restoration data. The neural network is a trained model, the measurement data is data obtained by measuring light transmitted through an object, the estimation data is data of a three-dimensional optical characteristic of the object estimated from the measurement data, and the three-dimensional optical characteristic is a refractive index distribution or an absorptance distribution. In the estimation, the neural network is used, in the restoration, forward propagation operations are performed on the estimation data, and in the forward propagation operations, wavefronts passing through the interior of the object estimated from the measurement data are sequentially obtained in a direction in which light travels.
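A toy version of the confidence calculation in step S4, assuming a residual-based score between the measurement data and the restoration data; the actual metric used by the method is not specified here:

```python
import numpy as np

def confidence_level(measurement, restoration):
    """Confidence of the estimation data from the measurement data and the
    restoration data: 1 for a perfect restoration, decreasing with the residual."""
    residual = np.linalg.norm(measurement - restoration)
    return 1.0 / (1.0 + residual / (np.linalg.norm(measurement) + 1e-12))

m = np.array([1.0, 0.9, 1.1, 1.0])
print(confidence_level(m, m * 0.98))    # close restoration -> confidence near 1
```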
A microscope system includes an incoherent light source, a detection optical system, and an imager. The incoherent light source is a light source that emits light that is temporally not coherent. In a sample, a plurality of coherent illuminations are performed simultaneously by light emitted from the incoherent light source. The coherent illuminations are illumination by light that is spatially coherent. The direction in which the sample is irradiated with a light beam is different for each coherent illumination. In a pupil plane of the detection optical system, the respective light beams of the coherent illuminations pass through first regions different from each other. Each of the first regions satisfies the following Condition (1). At least one distance among distances between the two adjacent first regions satisfies the following Condition (2).
An insertion state determination system includes a sensor unit including a first sensor and the system includes a second sensor and a processor. The first sensor is configured to determine a first rotation amount indicating a rotation amount of an elongated insertion unit of an endoscope device around a center axis of the insertion unit. A hole through which the insertion unit passes is formed in the sensor unit. The second sensor is disposed in the sensor unit or an object fixed to the sensor unit and is configured to determine a second rotation amount indicating a rotation amount of the sensor unit around the center axis when the insertion unit is inserted into the subject. The processor is configured to calculate a corrected rotation amount by correcting the first rotation amount based on the second rotation amount.
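As a simple illustration, the corrected rotation amount can be obtained by removing the sensor unit's own rotation from the first sensor's reading; plain subtraction of angles is an assumption:

```python
def corrected_rotation(first_rotation_deg, second_rotation_deg):
    """Remove the sensor unit's own rotation (second rotation amount) from the
    insertion unit's measured rotation (first rotation amount)."""
    return first_rotation_deg - second_rotation_deg

# The first sensor reads 35 degrees while the sensor unit itself turned 10 degrees:
print(corrected_rotation(35.0, 10.0))   # 25.0 degrees of actual insertion-unit rotation
```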
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
44.
ENDOSCOPE DEVICE, ENDOSCOPE SYSTEM, CONVERGENCE ANGLE CHANGING METHOD, AND COMPUTER-READABLE MEDIUM
An endoscope device includes an imaging optical system including two incident light paths formed by two optical systems respectively having optical axes eccentric with respect to an imaging center of an imaging element, an optical action member configured to change a convergence angle, and a processor configured to control a state of the optical action member in the imaging optical system. The processor changes the convergence angle by controlling the state of the optical action member in the imaging optical system.
G02B 7/18 - Mountings, adjusting means, or light-tight connections, for optical elements for prisms; Mountings, adjusting means, or light-tight connections, for optical elements for mirrors
G02B 23/24 - Instruments for viewing the inside of hollow bodies, e.g. fibrescopes
G02B 7/28 - Systems for automatic generation of focusing signals
H04N 23/55 - Optical parts specially adapted for electronic image sensors; Mounting thereof
A microscope system (1) comprises: a microscope (100) that forms an image of a sample containing sperm; an imaging device (143) that acquires the image of the sample; a processing device (200) that generates, on the basis of the acquired image, an auxiliary image containing information related to grading of the sperm; and a projection device (153) that superimposes the auxiliary image onto the image plane where the microscope (100) forms the image. The processing device (200) extracts at least one portion of the sperm from the image by utilizing a segmentation model generated by means of deep learning. The processing device (200) additionally grades the sperm on the basis of a measured value of a feature quantity measured from the at least one portion that was extracted and preregistered grading criteria indicating the relationship between the feature quantity and the grade.
An endoscopic device includes: an insertion portion that is inserted into a test subject and includes an imaging element; an operation receiving unit; and a processor. The processor, during recording processing of a moving image including a plurality of frame images generated on the basis of an imaging signal output from the imaging element, adds, as a tag, information regarding an operation to a frame image corresponding to a timing when the operation receiving unit receives the operation, and adds, as a tag, information regarding a specific feature image to a frame image recognized as the specific feature image, and extracts a frame image from among the plurality of frame images included in the moving image on the basis of a tag.
A61B 1/05 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
G16H 30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
A microscope system includes an eyepiece-side observation optical system that forms an image of a sample on an object side of an eyepiece, a camera-side observation optical system that forms an image of the sample on an image sensor, a diaphragm that limits a numerical aperture on the emission side of the camera-side observation optical system, and a processor that analyzes the image of the sample captured by the image sensor. In a case where a numerical aperture on an object side of an objective lens is denoted by NA, the numerical aperture on the emission side of the camera-side observation optical system determined by a light flux emitted from the camera-side observation optical system toward the image sensor at capturing the image is denoted by NA′, and a total magnification of the camera-side observation optical system is denoted by M1, the following is satisfied.
M1×NA′
A lens-barrel device includes: a relay optical system that relays a primary image formed by an imaging lens to an object plane of an eyepiece to form a secondary image; a first reflection optical system that reflects and bends the light flux from the imaging lens; a second reflection optical system that reflects and bends the light flux that has passed through the first reflection optical system; a third reflection optical system that reflects and bends the light flux that has passed through the second reflection optical system; an additional optical system that transmits the light flux for forming an image different from the secondary image, on the object plane of the eyepiece; and a compositing optical element that guides the light flux from the additional optical system to a light path to the object plane of the eyepiece.
A laser scanning microscope includes a scanner that scans a sample with laser light; a detector having a silicon photomultiplier (SiPM); and a processor that executes image processing of removing dark count noise based on an appearance frequency of the dark count noise in the SiPM on a scanned image.
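A rough sketch of frequency-based dark-count removal, assuming the expected dark counts per pixel follow from the SiPM's dark-count rate and the pixel dwell time; subtraction with clipping is an illustrative choice, not the patented image processing:

```python
import numpy as np

def remove_dark_counts(scanned_image, dark_count_rate_hz, dwell_time_s):
    """Subtract the dark-count contribution expected per pixel, derived from the
    SiPM's dark-count appearance frequency and the pixel dwell time."""
    expected_dark = dark_count_rate_hz * dwell_time_s   # expected dark counts per pixel
    return np.clip(scanned_image - expected_dark, 0, None)

image = np.array([[3.0, 12.0], [2.0, 8.0]])
print(remove_dark_counts(image, dark_count_rate_hz=100e3, dwell_time_s=20e-6))  # subtracts 2 counts
```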
A microscope system includes an observation optical system that forms an optical image of a specimen on an object side of an ocular lens; a projection device that superimposes information on an image plane on which the optical image is formed; an imaging device that is provided on an imaging optical path branched from an observation optical path; and a processor. The processor controls the projection device to superimpose, on the image plane, focus information based on a captured image of the specimen acquired by the imaging device and analysis information regarding a result of image analysis on the captured image, the image analysis being different from focus analysis.
An objective includes: an objective barrel; a plurality of lens sub-assemblies stacked in an optical axis direction of the objective in the objective barrel, each of the plurality of lens sub-assemblies including a lens and a lens frame holding the lens; and a screw member screwed into the objective barrel, the screw member pressing the plurality of lens sub-assemblies against the objective barrel along the optical axis direction of the objective. The screw member has three or more concave portions into which a jig is fitted when the screw member is screwed into the objective barrel.
An ocular tube for a microscope, to which an ocular lens is to be attached, includes a superimposition device that superimposes an assistive image on an image plane on which an optical image is formed with light from the microscope.
A three-dimensional data generation method includes the following processing. A processor acquires two or more images of a component inside a turbine. The processor detects two or more correspondence regions that are the same regions in at least two images. The processor determines whether at least part of a region of each image is a change region or a non-change region. The processor generates three-dimensional data by using a correspondence region determined to be the change region without using a correspondence region determined to be the non-change region.
Provided is a specimen image generation device with which it is possible to recover an image in a highly accurate manner. This specimen image generation device 1 comprises a memory 2 and a processor 3. A first image is an image obtained by imaging a specimen, and a prescribed direction in the first image is the direction, from among the optical axis directions of a virtual observation optical system, in which the virtual observation optical system is present. The processor 3 acquires the first image from the memory 2, divides the first image into a plurality of areas, acquires the refractive index distribution of the specimen from the memory 2, calculates a point image intensity distribution using the refractive index distribution, generates a second image using the point image intensity distribution, synthesizes the second image, and generates a third image corresponding to the first image. In the calculation process, the point image intensity distribution of a first area is calculated using the refractive index distribution of areas included in an area group, the first area being an area for which the point image intensity distribution is to be calculated, and the area group being configured from a plurality of areas on the inside of a range in which a light beam is radiated in a prescribed direction from the first area.
G01N 21/27 - Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection
REFRACTIVE INDEX DISTRIBUTION GENERATION DEVICE, REFRACTIVE INDEX DISTRIBUTION GENERATION METHOD, REFRACTIVE INDEX DISTRIBUTION GENERATION SYSTEM, AND RECORDING MEDIUM
Provided is a refractive index distribution generation device capable of improving the accuracy of a refractive index distribution of even a thick sample. This refractive index distribution generation device comprises a processor and a memory. The processor executes refractive index distribution generation processing of generating a refractive index distribution for a processing target image. The refractive index distribution generation processing includes: input processing of inputting, from a memory, the processing target image, first refractive index information indicating the refractive index of a first structure, and second refractive index information indicating the refractive index of a second structure different from the first structure; and setting processing of setting each refractive index constituting the refractive index distribution. The setting processing includes: first setting processing of setting a refractive index, which is based on the first refractive index information, at a position corresponding to a first image region of the processing target image on the basis of a signal strength; and second setting processing of setting a refractive index, which is based on the second refractive index information, at a position corresponding to an image region different from the first image region of the processing target image. The first image region is an image region corresponding to the first structure.
G01N 21/27 - Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection
MOTORIZED CORRECTION COLLAR SYSTEM, METHOD OF CORRECTION COLLAR CALIBRATION PERFORMED BY MOTORIZED CORRECTION COLLAR SYSTEM, AND COMPUTER READABLE MEDIUM
A motorized correction collar system includes: an attachment portion; a control unit that controls transmission of force to a correction collar ring of a first objective lens attached to the attachment portion; and a storage unit that stores calibration information at least for each type of objective lens with a correction collar. The control unit acquires calibration information corresponding to the first objective lens from among the calibration information stored in the storage unit, and calibrates the correction collar of the first objective lens based on the calibration information.
An optical signal detection device includes an objective and a gel unit including a gel and an assisting member. The assisting member includes a first surface, a second surface, and an opening. The opening has a diameter smaller than an outer diameter of the gel and larger than an effective diameter of the objective. The first surface faces the gel, and a part of the first surface is fixed to a bottom surface of the gel by an adhesive force. A center of the bottom surface is not in contact with the assisting member. In a state where the gel unit is attached to a frame member of the objective, the second surface faces the frame member of the objective, a part of the first surface is fixed to the gel outside the effective diameter of the objective, and the gel is in close contact with the objective.
A microscope system includes: a microscope optical system that includes an ocular lens and forms an optical image of a sample on an object side of the ocular lens; a processor that generates auxiliary image data based on information regarding a target slide selected from among a plurality of ordered slides included in a slide set; and a superimposing device that superimposes, based on the auxiliary image data, an auxiliary image including the target slide on an image plane on which the optical image is formed. The processor selects, in response to an instruction to switch the target slide, a slide determined according to a first order in which the plurality of slides are ordered as a new target slide.
An observation apparatus includes: an illumination optical system provided below an installation position of a multi-well plate; a reflector provided above the installation position, the reflector being configured to reflect light emitted from the illumination optical system; and an observation optical system provided below the installation position, the observation optical system being configured to condense the light reflected by the reflector. The reflector is installed such that a marginal ray to enter the observation optical system travels via a peripheral well different from an on-axis well located on an optical axis of the observation optical system before reflection due to the reflector.
A movable optical unit configured to be rotatable around an axis by an electromagnet includes a fixed shaft, a bearing through which the fixed shaft is inserted and which is polarized in a direction orthogonal to a long axis of the fixed shaft, a holding frame that is provided to be rotatable around the fixed shaft and holds at least one optical member, and a pair of arm members extending outward from the holding frame in a direction orthogonal to the long axis of the fixed shaft, the pair of arm members being bonded to the bearing in a state of sandwiching the bearing in a direction along the long axis of the fixed shaft.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
G02B 7/04 - Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
G02B 23/24 - Instruments for viewing the inside of hollow bodies, e.g. fibrescopes
G02B 7/14 - Mountings, adjusting means, or light-tight connections, for optical elements for lenses adapted to interchange lenses
G02B 7/02 - Mountings, adjusting means, or light-tight connections, for optical elements for lenses
H01F 7/08 - Electromagnets; Actuators including electromagnets with armatures
A microscope system 1 comprises: a microscope 100 that forms an optical image of a sperm sample which contains an object of selection; an imaging device 143 for acquiring a digital image of the sperm sample; a processing device 200 that generates an object image which is an image of the object of selection, such generation being on the basis of sperm detection and sperm form evaluation with respect to the digital image; and a projection device 153 that displays the object image on an image surface where the optical image is formed, the object image being displayed larger than the object of selection in the optical image.
An image capturing device for observing a sample housed in a container to which identification information is attached, from below the container, includes: an image capturing unit including an image pickup element; a light guide unit that guides light from an identification surface to the image capturing unit, the identification surface being a surface of the container which differs from a bottom surface of the container and to which identification information is attached; and a mobile unit that changes a relative position of the image capturing unit with respect to the container. After the mobile unit changes the relative position to a first relative position in which the optical axis of the image capturing unit deviates from the container, the image capturing unit images the identification surface via the light guide unit, and, after the mobile unit changes the relative position to a second relative position in which the optical axis of the image capturing unit intersects the container, the image capturing unit images the sample via the bottom surface.
The objective includes a first lens group having positive power and a second lens group having negative power and including a pair of meniscus lens components having concave surfaces facing each other. The first lens group includes a first lens situated closest to an object side and has positive power with a concave surface facing the object side. The objective includes three or more cemented lenses arranged closer to the object side than the pair of meniscus lens components, and satisfies the following conditional expressions.
2.6≤φL1/DL1≤16 (1)
0.1≤|R212|/f≤3.5 (2)
Here, φL1 and DL1 are an outer diameter and a thickness of the first lens. R212 is a radius of curvature of a surface closest to the image side in a first meniscus lens component among the pair of meniscus lens components. f is a focal length of the objective.
G02B 9/10 - Optical objectives characterised both by the number of the components and their arrangements according to their sign, i.e. + or – having two components only one + and one – component
An objective includes a first lens group having positive power, a second lens group including a pair of lens components having concave surfaces facing each other, and a third lens group having positive power, and the pair of lens components is cemented lenses, and the following conditional expression is satisfied.
0.20≤d2/L≤0.5 (1)
where d2 is a distance between a surface situated closest to an image side in a first lens component and a surface situated closest to the object side in a second lens component, the first lens component being a lens component situated on the object side and the second lens component being a lens component situated on the image side of the pair of lens components. L is a distance between a surface situated closest to the object side and a surface situated closest to the image side in the objective.
G02B 9/12 - Optical objectives characterised both by the number of the components and their arrangements according to their sign, i.e. + or – having three components only
A microscope objective includes: a first group including a plurality of meniscus lenses with a concave surface facing an object side; a second group including a first cemented lens closest to the object side; a third group consisting of a positive single lens; a fourth group including a first cemented meniscus lens with a concave surface facing the object side; a fifth group consisting of a second cemented meniscus lens with a concave surface facing an image side; and a sixth group including a third cemented meniscus lens with a concave surface facing the object side, the first to sixth groups being arranged in order from the object side. The microscope objective satisfies the following conditional expression.
0<|f/f5|<0.15 (1)
where f is the focal length of the microscope objective and f5 is the focal length of the fifth group.
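Because conditional expression (1) only bounds the ratio of the two focal lengths, a quick numerical check with assumed values (illustrative only, not from any embodiment) reads as follows.

f = 1.8     # assumed focal length (mm) of the microscope objective
f5 = -60.0  # assumed focal length (mm) of the fifth group; its weak power keeps the ratio small
print(0 < abs(f / f5) < 0.15)  # True, since |f / f5| = 0.03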
A microscope system includes: a microscope optical system including an ocular lens, the microscope optical system being configured to form an optical image of a sample on an object side of the ocular lens; a processor configured to generate, based on examination information regarding examination to the sample and magnification information regarding a magnification of the microscope optical system, image data of a comparative image for comparison with the optical image; and a superimposition device configured to superimpose, based on the image data generated by the processor, the comparative image onto an image plane on which the optical image is formed.
A microscope system comprises: a light source; an objective lens; a stage; a two-dimensional image sensor that captures an image of a specimen placed on the stage; a focusing device that changes distance between the objective lens and the stage; and a control circuit, wherein the control circuit executes, during a movement period in which the stage moves in a direction orthogonal to an optical axis of the objective lens, focus control for controlling the focusing device based on focus evaluation information detected during the movement period, and exposure control for controlling an exposure period of the two-dimensional image sensor, and executes light emission control that causes the light source to emit light with different light emission intensities during the exposure period and during a focus evaluation period in which the focus evaluation information is detected.
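The abstract interleaves three controls (focus, exposure, and light emission) within one movement period. The sketch below is only one possible reading of that interleaving; the helper objects, method names, and the two intensity levels are assumptions for the example, not the claimed implementation.

LOW_INTENSITY = 0.2    # assumed emission level during the focus evaluation period
HIGH_INTENSITY = 1.0   # assumed emission level during the exposure period
EXPOSURE_TIME = 0.01   # assumed exposure period in seconds

def scan_while_moving(light, sensor, focuser, stage):
    # light, sensor, focuser and stage are hypothetical hardware interfaces.
    while stage.is_moving():                       # movement period, orthogonal to the optical axis
        light.set_intensity(LOW_INTENSITY)         # light emission control: focus evaluation period
        focus_info = sensor.read_focus_evaluation()
        focuser.adjust(focus_info)                 # focus control based on the detected evaluation info
        light.set_intensity(HIGH_INTENSITY)        # light emission control: exposure period
        sensor.expose(EXPOSURE_TIME)               # exposure control of the two-dimensional image sensor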
A system for evaluating stem cell differentiation includes a storage configured to store a machine learned model that has learned success or failure of cell differentiation for a combination of a stem cell and a differentiation induction method, an acquisition unit configured to acquire a target cell image that is an image of a target cell that is a stem cell to be induced to differentiate and differentiation induction information that is information related to a differentiation induction method applied to the target cell, and a processor configured to output differentiation success or failure information indicating an inference result related to success or failure of differentiation of the target cell into a desired cell type on the basis of the target cell image, the differentiation induction information, and the machine learned model. The differentiation induction information includes information indicating the type of a stimulus given to the target cell.
A control method for displaying inspection images causes a processor to display an image of a model of an inspection object including a plurality of inspection sites, in a first display region and display first inspection information relevant to at least one inspection site, in a second display region, where the at least one inspection site is selected from the plurality of inspection sites in the image of the model displayed in the first display region. The control method further causes the processor to display inspection information prior to the first inspection information, as second inspection information relevant to the model, in a third display region, concurrently with one or more of the first display region and the second display region.
An immersion microscope objective includes a first lens group including a meniscus lens, a second lens group including a cemented lens and having positive power, and a third lens group having negative power. The third lens group is formed of a front group having a concave surface on the most image side and a back group having a concave surface on the most object side. Even when any of a plurality of immersion liquids used together with the immersion microscope objective is used, an amount of chromatic aberration at each of wavelengths in a range from 435.18 nm to 656.13 nm, with the e-line as a reference, is smaller than a magnitude of depth of focus of the immersion microscope objective at the wavelength. The immersion microscope objective satisfies the following conditional expression.
0.64≤NA×WD≤3.5 (1)
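Taking NA and WD as, presumably, the numerical aperture and the working distance of the immersion microscope objective, conditional expression (1) can be checked as below with assumed values.

NA = 1.0   # assumed numerical aperture of the immersion microscope objective
WD = 2.0   # assumed working distance in mm
print(0.64 <= NA * WD <= 3.5)  # True, since NA * WD = 2.0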
In an inspection assistance method, a processor accepts inspection portion information related to an inspection portion. In the inspection assistance method, the processor acquires a result of an inspection of the inspection portion corresponding to the inspection portion information from a storage medium. In the inspection assistance method, the processor displays inspection assistance information on a display. The inspection assistance information notifies a user of information related to an abnormality of the inspection portion detected in a previous inspection. In the inspection assistance method, the processor displays an image of the inspection portion on the display.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
A data management system includes: an input control unit (110) that adds measurement data, which is acquired by a measurement system and uploaded with designation of a theme that is a unit of access control, to a dataset within the theme, the dataset being classified based on metadata included in the measurement data; a management control unit (120) that associates the dataset with a chapter that is a unit of status management and is provided within the theme; and a display control unit (130) that displays information managed by the data management system.
G06F 16/16 - File or folder operations, e.g. details of user interfaces specifically adapted to file systems
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
09 - Scientific and electric apparatus and instruments
37 - Construction and mining; installation and repair services
41 - Education, entertainment, sporting and cultural services
42 - Scientific, technological and industrial services, research and design
44 - Medical, veterinary, hygienic and cosmetic services; agriculture, horticulture and forestry services
Goods & Services
Scientific laboratory apparatus and instruments; scientific research apparatus and instruments; microscopes and parts and fittings therefor; objective lenses for microscopes; industrial endoscopes and parts and fittings therefor; nondestructive testing instruments and parts and fittings therefor; ultrasonic flaw detectors and parts and fittings therefor; X-ray fluorescence analyzers and parts and fittings therefor; microscopic software to control microscopes, capture images, and process, measure and analyze images captured by digital imaging devices for industrial use; downloadable cloud computing software; laboratory apparatus and instruments; photographic machines and apparatus; cinematographic machines and apparatus; optical machines and apparatus; measuring or testing machines and instruments; electronic microscopes, apparatus and their parts; scanning electron microscopes, apparatus and their parts.
Repair and maintenance of microscopes, industrial endoscopes, nondestructive testing instruments, ultrasonic flaw detectors, x-ray fluorescence analyzers; repair or maintenance of cinematographic machines and apparatus; repair or maintenance of optical machines and apparatus; repair or maintenance of photographic machines and apparatus; repair or maintenance of electronic machines and apparatus; repair or maintenance of laboratory apparatus and instruments; repair or maintenance of measuring and testing machines and instruments.
Arranging, conducting and organization of seminars on microscopes, nondestructive testing instruments, measuring apparatus; arranging, conducting and organization of seminars.
Cloud computing; electronic storage services for archiving databases; software as a service [SaaS]; testing or research on microscopes; image analyzing services; rental of computers; providing computer programs on data networks; testing, inspection or research of pharmaceuticals, cosmetics or foodstuffs; research on building construction or city planning; testing or research on prevention of pollution; testing or research on electricity; testing or research on civil engineering; testing, inspection or research on agriculture, livestock breeding or fisheries; testing or research on machines, apparatus and instruments.
Medical image analyzing services for medical diagnosis; providing medical information using microscope images; providing medical information; physical examination; dietary and nutritional guidance; animal breeding; veterinary services; beautification for animals; massage and therapeutic shiatsu massage; chiropractic; moxibustion; treatment for dislocated joints, sprain, bone fractures or the like [judo-seifuku]; bodywork therapy; acupuncture.
A fabrication support apparatus for a layered cell sheet, in which a plurality of cell sheets are layered on one another, includes circuitry. The circuitry is configured to acquire optical information that is information regarding optical properties of the layered cell sheet; to calculate a thickness distribution of the layered cell sheet based on the acquired optical information; to determine a layering position of a cell sheet to be newly layered on the layered cell sheet based on the calculated thickness distribution; and to output information of the determined layering position.
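One plausible way to turn a thickness distribution into a layering position, given only for illustration and not implied by the abstract, is to place the next sheet where the current stack is thinnest, as in the sketch below.

import numpy as np

def choose_layering_position(thickness_map):
    # thickness_map: 2-D array of layered-cell-sheet thickness derived from the optical information
    row, col = np.unravel_index(np.argmin(thickness_map), thickness_map.shape)
    return row, col  # grid position of the thinnest point, used as the next layering position

thickness = np.array([[10.0, 9.5],
                      [8.0, 9.8]])  # hypothetical thickness values (micrometres)
print(choose_layering_position(thickness))  # (1, 0)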
G05B 19/402 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by control arrangements for positioning, e.g. centring a tool relative to a hole in the workpiece, additional detection means to correct position
An insertion assistance system includes a processor. The processor is configured to set a first position and a second position in shape information. The processor is configured to estimate a first state of a distal end of an insertion unit at the first position. The processor is configured to calculate a path through which the distal end passes. The processor is configured to determine a second state of the distal end at the second position. The processor is configured to output, to an information-reporting device, insertion assistance information required for an insertion operation for causing the distal end to reach the first position from the second position through the path and causing a state of the distal end to change from the second state to the first state.
A61B 1/05 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
78.
THREE-DIMENSIONAL RECONSTRUCTION DEVICE, THREE-DIMENSIONAL RECONSTRUCTION METHOD, AND PROGRAM
This three-dimensional reconstruction device has an information acquisition unit, a size acquisition unit, a correction coefficient calculation unit, a camera position correction unit, and a three-dimensional shape restoration unit. The information acquisition unit acquires position/orientation information and two-dimensional coordinate information. The size acquisition unit acquires the three-dimensional size of a subject. The correction coefficient calculation unit calculates a correction coefficient for causing the size at the position of the camera to match a known size. The camera position correction unit uses the correction coefficient to correct the position of the camera. The three-dimensional shape restoration unit uses the corrected position, the orientation of the camera at the position, and the two-dimensional coordinate information to restore the three-dimensional shape of the subject.
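Read literally, the correction coefficient is a scale factor that makes the subject's size in the reconstruction agree with its known three-dimensional size, and the camera positions are rescaled by that factor before the shape is restored. The sketch below illustrates that reading; it is an interpretation for illustration, not the disclosed implementation.

import numpy as np

def correct_camera_positions(cam_positions, reconstructed_size, known_size):
    # cam_positions: (N, 3) camera positions from the position/orientation information
    # reconstructed_size: subject size measured in the reconstruction's arbitrary scale
    # known_size: the acquired three-dimensional size of the subject
    k = known_size / reconstructed_size   # correction coefficient
    return cam_positions * k              # corrected positions used for three-dimensional shape restoration

positions = np.array([[0.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0]])
print(correct_camera_positions(positions, reconstructed_size=2.0, known_size=5.0))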
An internal prediction method includes acquiring an image of a cell aggregate, calculating a feature amount related to a shape of the cell aggregate on the basis of the image, and outputting structure information related to an internal structure of the cell aggregate on the basis of the feature amount.
An inspection analysis method includes an analysis step, a prediction step, and an output step. A processor analyzes inspection information related to an operation in an inspection and generates first operation data and second operation data indicating a content of the operation in the analysis step. The first operation data indicate a content of the operation in a first inspection. The second operation data indicate a content of the operation in a second inspection performed after the first inspection is completed. The processor compares the first operation data and the second operation data with each other and predicts a state of an inspection target in the prediction step. The processor outputs state information indicating the state to a reporting device in the output step.
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
A61B 1/06 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
An observation apparatus includes a display apparatus that displays a display pattern, a display projection optical system that projects a light beam from the display apparatus, and forms an image of the display pattern, a combining optical element that combines a light beam from a sample and a light beam from the display apparatus, and an eyepiece optical system that enables an observer to simultaneously observe an image of the sample and an image of the display pattern, in which a numerical aperture (NA) of a light beam from the display apparatus is smaller than a maximum value of an NA of a light beam from the sample, and is larger than a minimum value of an NA of a light beam from the sample, at a position of an image on an optical path that is formed after light beams are combined by the combining optical element.
A surface estimation method includes a region-setting step and an estimation step. In the region-setting step, a reference region that is one of a three-dimensional region and a two-dimensional region is set. The three-dimensional region includes three or more points and is set in a three-dimensional space. The three-dimensional space includes three-dimensional coordinates of three or more points on a subject calculated on the basis of a two-dimensional image of the subject. The three-dimensional coordinates of the three or more points are included in three-dimensional image data. The two-dimensional region includes three or more points and is set in the two-dimensional image. In the estimation step, a reference surface that approximates a surface of the subject is estimated on the basis of three or more points of the three-dimensional image data corresponding to the three or more points included in the reference region.
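A common concrete choice for the reference surface is a least-squares plane through the three or more selected points; the SVD-based fit below is offered only as an illustrative stand-in for the estimation step, and the claimed method is not limited to planes or to this technique.

import numpy as np

def fit_reference_plane(points):
    # points: (N, 3) array of three-dimensional coordinates from the reference region, N >= 3
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]            # direction of smallest variance, i.e. the plane normal
    return centroid, normal    # the plane through the centroid with this normal approximates the surface

pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.1], [0.0, 1.0, -0.1], [1.0, 1.0, 0.05]])
print(fit_reference_plane(pts))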
G02B 23/24 - Instruments for viewing the inside of hollow bodies, e.g. fibrescopes
G01B 11/03 - Measuring arrangements characterised by the use of optical techniques for measuring length, width, or thickness by measuring coordinates of points
G01B 11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
G01B 11/02 - Measuring arrangements characterised by the use of optical techniques for measuring length, width, or thickness
A microscope simulation device includes a processor. The processor is configured to acquire a plurality of pieces of component information each indicating a technical specification of a corresponding microscope component, simulate an assembly of a microscope system based on the acquired plurality of pieces of component information, and output a generated simulation result to a display device.
G06F 30/23 - Design optimisation, verification or simulation using finite element methods [FEM] or finite difference methods [FDM]
G02B 21/36 - Microscopes arranged for photographic purposes or projection purposes
G06F 30/12 - Geometric CAD characterised by design entry means specially adapted for CAD, e.g. graphical user interfaces [GUI] specially adapted for CAD
85.
OBSERVATION DEVICE, REFLECTOR, AND PHASE OBJECT OBSERVATION METHOD
An observation device includes an illumination optical system provided on a lower side of an installation position of a multi-well plate, a reflector that reflects light emitted from the illumination optical system, the reflector being provided on an upper side of the installation position, and an observation optical system that condenses the light reflected by the reflector, the observation optical system being provided on the lower side of the installation position. The reflector includes a plurality of curved surfaces where the light emitted from the illumination optical system enters. Each of the plurality of curved surfaces corresponds to one or more wells included in the multi-well plate, has positive power in a first direction in which the illumination optical system and the observation optical system are aligned, and has a center of curvature at a position deviating from a central axis of a well of the multi-well plate.
37 - Construction and mining; installation and repair services
41 - Education, entertainment, sporting and cultural services
42 - Scientific, technological and industrial services, research and design
44 - Medical, veterinary, hygienic and cosmetic services; agriculture, horticulture and forestry services
Goods & Services
Repair and maintenance of microscopes, industrial endoscopes, nondestructive testing instruments, ultrasonic flaw detectors, x-ray fluorescence analyzers; repair or maintenance of cinematographic machines and apparatus; repair or maintenance of optical machines and apparatus; repair or maintenance of photographic machines and apparatus; repair or maintenance of electronic machines and apparatus; repair or maintenance of laboratory apparatus and instruments; repair or maintenance of measuring and testing machines and instruments.
Arranging, conducting and organization of seminars on microscopes, nondestructive testing instruments, measuring apparatus; arranging, conducting and organization of seminars.
Cloud computing; electronic storage services for archiving databases; software as a service [SaaS]; testing or research on microscopes; image analyzing services; rental of computers; providing computer programs on data networks; testing, inspection or research of pharmaceuticals, cosmetics or foodstuffs; research on building construction or city planning; testing or research on prevention of pollution; testing or research on electricity; testing or research on civil engineering; testing, inspection or research on agriculture, livestock breeding or fisheries; testing or research on machines, apparatus and instruments.
Medical image analyzing services for medical diagnosis; providing medical information using microscope images; providing medical information; physical examination; dietary and nutritional guidance; animal breeding; veterinary services; beautification for animals; massage and therapeutic shiatsu massage; chiropractic; moxibustion; treatment for dislocated joints, sprain, bone fractures or the like [judo-seifuku]; bodywork therapy; acupuncture.
87.
Three-dimensional image display method, three-dimensional image display device, and recording medium
In a three-dimensional image display method, a processor acquires first three-dimensional data and second three-dimensional data of a subject from a recording medium. The processor converts a first three-dimensional coordinate system of the first three-dimensional data and a second three-dimensional coordinate system of the second three-dimensional data into a three-dimensional common coordinate system on the basis of structure information related to a geometric structure of the subject. The processor displays an image of the first three-dimensional data in the common coordinate system and an image of the second three-dimensional data in the common coordinate system on a display.
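Displaying both data sets in a common coordinate system amounts to applying, to each set, a transform estimated from the subject's geometric structure; the sketch below simply applies assumed rigid transforms (rotation R, translation t) and leaves their estimation out of scope. It is illustrative, not the disclosed method.

import numpy as np

def to_common(points, R, t):
    # points: (N, 3) coordinates in one data set's own coordinate system
    # R, t: rotation and translation into the common coordinate system (assumed already estimated)
    return points @ R.T + t

R1, t1 = np.eye(3), np.zeros(3)                                       # first data set taken as the reference (assumed)
R2 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])   # hypothetical rotation about z
t2 = np.array([10.0, 0.0, 0.0])                                       # hypothetical offset
print(to_common(np.array([[1.0, 2.0, 3.0]]), R1, t1))
print(to_common(np.array([[1.0, 2.0, 3.0]]), R2, t2))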
H04N 13/207 - Image signal generators using stereoscopic image cameras using a single 2D image sensor
H04N 13/361 - Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
H04N 13/111 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
H04N 13/167 - Synchronising or controlling image signals
88.
Illumination optical system for endoscope, optical adaptor and endoscope
An illumination optical system for endoscope includes a rod lens as an optical element. The optical element includes a proximal end surface through which light enters and a distal end surface configured to emit the light. The distal end surface includes a diffusion region configured to diffuse emitted light. The diffusion region includes a plurality of concave portions and a plurality of peripheral regions surrounding the concave portions. Each concave portion includes a plurality of inclined surfaces as total reflection surfaces that are inclined with respect to the distal end surface. Each peripheral region includes a transmission surface configured to emit light totally reflected by the total reflection surfaces after passing through the proximal end surface and light not totally reflected by the total reflection surfaces.
A61B 1/06 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
A61B 1/07 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements using light-conductive means, e.g. optical fibres
A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
An observation device includes: a macro observation system; and a micro observation system. The macro observation system and the micro observation system are arranged so as to satisfy a first condition. The first condition is that a distance from a macro optical axis to a micro optical axis is equal to or less than a square root of a sum of squares of a first distance and a second distance. The first distance is a distance between the macro optical axis and a central axis of an outer diameter of the nosepiece. The second distance is a distance in a first direction between the central axis of the outer diameter and a side surface of the nosepiece. The first direction is a direction orthogonal to the macro optical axis and orthogonal to a line segment connecting the macro optical axis and the central axis of the outer diameter.
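The first condition compares an axis-to-axis distance with the hypotenuse formed by the first and second distances; a quick check with assumed dimensions (illustrative only, not from any embodiment) looks like this.

import math

d_axes = 30.0  # assumed distance (mm) from the macro optical axis to the micro optical axis
d1 = 25.0      # assumed first distance: macro optical axis to the central axis of the nosepiece outer diameter
d2 = 20.0      # assumed second distance: central axis to the side surface of the nosepiece, along the first direction
print(d_axes <= math.sqrt(d1**2 + d2**2))  # True, since the square root is about 32.0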
09 - Scientific and electric apparatus and instruments
37 - Construction and mining; installation and repair services
41 - Education, entertainment, sporting and cultural services
42 - Scientific, technological and industrial services, research and design
44 - Medical, veterinary, hygienic and cosmetic services; agriculture, horticulture and forestry services
Goods & Services
Microscopes and parts and fittings therefor; objective lenses for microscopes; industrial endoscopes and parts and fittings therefor for non-medical use; nondestructive testing instruments and parts and fittings therefor, namely, flaw detectors, thickness gages, bond testers, transducers, eddy current probes, radiography systems and parts and fittings therefor for non-medical use; X-ray fluorescence analyzers and parts and fittings therefor; recorded software to control microscopes, capture images, and process, measure and analyze images captured by digital imaging devices for industrial use; downloadable cloud computing software to control microscopes, capture images, and process, measure and analyze images captured by digital imaging devices for industrial use; laboratory apparatus and instruments, namely, monitoring or measuring machines; optical machines and apparatus, namely, slide scanners; measuring or testing machines and instruments, namely, flaw detectors, bond testers, thickness gages, transducers, eddy current probes, industrial endoscopes, X-ray fluorescence analyzers, radiography systems and parts and fittings therefor.
Repair and maintenance of microscopes, industrial endoscopes, nondestructive testing instruments, ultrasonic flaw detectors, x-ray fluorescence analyzers; repair or maintenance of cinematographic machines and apparatus; repair or maintenance of optical machines and apparatus; repair or maintenance of photographic machines and apparatus; repair or maintenance of electronic machines and apparatus; repair or maintenance of laboratory apparatus and instruments; repair or maintenance of measuring and testing machines and instruments.
Arranging, conducting and organization of seminars in the field of microscopes, nondestructive testing instruments, and measuring apparatus.
Cloud computing featuring software for scientific testing, measurement, and analysis; electronic storage services for archiving databases; software as a service (SAAS) services featuring software for scientific testing, measurement and analysis; scientific testing and research with the use of microscopes, flaw detectors, bond testers, thickness gages, transducers, eddy current probes, industrial endoscopes, X-ray fluorescence analyzers and radiography systems; providing computer programs on data networks, namely, providing online non-downloadable software to control and store data for microscopes, flaw detectors, bond testers, thickness gages, transducers, eddy current probes, industrial endoscopes, X-ray fluorescence analyzers and radiography systems, and to capture images, and process, measure and analyze images captured by digital imaging devices; scientific testing, inspection and research.
Medical analysis services related to medical imaging; providing medical information using microscope images; medical practice, namely, providing medical services, providing medical information.
91.
Image-processing method, image-processing device, and recording medium
An image-processing method includes a measurement step, an index calculation step, a comparison step, and a selection step. A processor measures a distance at each of two or more points in one or more images of a subject in the measurement step. The processor calculates a first index or a second index on the basis of the distance in the index calculation step. The processor compares the first index or the second index with a threshold value in the comparison step. The processor selects at least one image included in the one or more images in the selection step when the first index is greater than the threshold value or the second index is less than the threshold value.
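The selection logic can be read as thresholding an index derived from the measured distances; the sketch below illustrates that reading with placeholder index definitions, since the abstract does not specify how the first and second indices are computed.

def select_images(images, distances, threshold, use_first_index=True):
    # images: candidate images of the subject; distances: distances measured at two or more points
    first_index = max(distances)    # placeholder definition of the first index
    second_index = min(distances)   # placeholder definition of the second index
    if use_first_index:
        keep = first_index > threshold
    else:
        keep = second_index < threshold
    return images[:1] if keep else []  # "at least one image" is selected when the comparison fires

print(select_images(["img0", "img1"], [4.2, 5.0, 6.1], threshold=5.5))  # ['img0']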
09 - Scientific and electric apparatus and instruments
37 - Construction and mining; installation and repair services
41 - Education, entertainment, sporting and cultural services
42 - Scientific, technological and industrial services, research and design
44 - Medical, veterinary, hygienic and cosmetic services; agriculture, horticulture and forestry services
Goods & Services
(1) Microscopes and parts therefor; objective lenses for microscopes; industrial endoscopes and parts therefor; nondestructive testing instruments and parts therefor, namely, ultrasound flaw detectors not for medical purposes and parts therefor, eddy current flaw detectors and parts therefor, bond testing instruments and parts therefor, industrial x-ray instruments and parts therefor, industrial videoscopes and parts therefor, x-ray fluorescence analyzers and parts therefor; thickness gauges and parts therefor; ultrasonic flaw detectors and parts therefor; x-ray fluorescence analyzers and parts therefor; microscope software to control microscopes, capture images, and process, measure and analyze images captured by digital imaging devices for industrial use; downloadable computer software for running cloud computing based applications; downloadable cloud computing software, namely, software for microscopes, namely, software to control microscopes, capture images and process, measure and analyze images; downloadable cloud computing software to control non-destructive testing instruments, namely, ultrasound flaw detectors not for medical purposes, eddy current flaw detectors, bond testing instruments, industrial x-ray instruments, x-ray fluorescence analyzers, industrial videoscopes; downloadable cloud computing software to capture images and measurements from non-destructive testing instruments, namely, ultrasound flaw detectors not for medical purposes, eddy current flaw detectors, bond testing instruments, industrial x-ray instruments, x-ray fluorescence analyzers, industrial videoscopes; downloadable cloud computing software to analyze images and measurements from non-destructive testing instruments, namely, ultrasound flaw detectors not for medical purposes, eddy current flaw detectors, bond testing instruments, industrial x-ray instruments, x-ray fluorescence analyzers, industrial videoscopes; measuring and testing machines and instruments, namely, ultrasound flaw detectors not for medical purposes, eddy current flaw detectors, bond testing instruments, industrial x-ray instruments, x-ray fluorescence analyzers, industrial videoscopes and parts therefor; electronic microscopes, and their parts; scanning electron microscopes, and their parts.
(1) Repair and maintenance of microscopes, industrial endoscopes, nondestructive testing instruments, ultrasonic flaw detectors, x-ray fluorescence analyzers.
(2) Arranging, conducting and organization of seminars on microscopes, nondestructive testing instruments, measuring apparatus.
(3) Cloud computing provider for general storage of data; cloud computing providing software for database management; software as a service for microscopes, namely, software to control microscopes, capture images and process, measure and analyze images; software as a service to control non-destructive testing instruments, namely, ultrasound flaw detectors not for medical purposes, eddy current flaw detectors, bond testing instruments, industrial x-ray instruments, x-ray fluorescence analyzers, industrial videoscopes; software as a service to capture images and measurements from non-destructive testing instruments, namely, ultrasound flaw detectors not for medical purposes, eddy current flaw detectors, bond testing instruments, industrial x-ray instruments, x-ray fluorescence analyzers, industrial videoscopes; software as a service to analyze images and measurements from non-destructive testing instruments, namely, ultrasound flaw detectors not for medical purposes, eddy current flaw detectors, bond testing instruments, industrial x-ray instruments, x-ray fluorescence analyzers, industrial videoscopes; testing and research on microscopes; rental of computers; testing, inspection and research of pharmaceuticals, cosmetics and foodstuffs; research on building construction and city planning; testing and research on prevention of pollution; testing and research on electricity; testing and research on civil engineering; testing, inspection and research on agriculture, livestock breeding and fisheries.
(4) Medical image analyzing services for medical diagnosis; providing medical information using microscope images; physical examination; dietary and nutritional guidance; veterinary services; beautification for animals; massage and therapeutic shiatsu massage; chiropractic; moxibustion; bodywork therapy; acupuncture.
An inverted microscope apparatus includes an immersion objective, an electric stage that moves at least in a direction orthogonal to an optical axis of the immersion objective, and a removal mechanism that removes an immersion liquid adhering to a bottom surface of a container placed on the electric stage. The removal mechanism is configured to scan the bottom surface using movement of the electric stage.
An information processing apparatus includes a processor configured to: control a display to display at least a partial image of a captured image generated by capturing an image of an observation target; generate field of view information by associating display position information indicating a position of a display area corresponding to the displayed partial image, magnification information indicating a display magnification of the partial image, and time information indicating a display time of the partial image; record the field of view information in a memory; extract a piece of the field of view information including the magnification information indicating at least a single kind of specific display magnification input to an input device from among pieces of the field of view information recorded in the memory; generate a field of view map image corresponding to the extracted piece of field of view information; and control the display to display the field of view map image.
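One way to picture the field of view information and the magnification-based extraction is a list of small records filtered by display magnification, as sketched below; the field names are assumptions made for the example, not part of the disclosure.

from dataclasses import dataclass

@dataclass
class FieldOfViewInfo:
    display_position: tuple   # position of the display area corresponding to the displayed partial image
    magnification: float      # display magnification of the partial image
    display_time: float       # display time of the partial image, e.g. in seconds

def extract_by_magnification(records, target_magnification):
    # Return the pieces of field of view information whose magnification matches the input value.
    return [r for r in records if r.magnification == target_magnification]

records = [FieldOfViewInfo((0, 0), 10.0, 3.2), FieldOfViewInfo((120, 80), 40.0, 1.5)]
print(extract_by_magnification(records, 40.0))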
09 - Scientific and electric apparatus and instruments
37 - Construction and mining; installation and repair services
41 - Education, entertainment, sporting and cultural services
42 - Scientific, technological and industrial services, research and design
44 - Medical, veterinary, hygienic and cosmetic services; agriculture, horticulture and forestry services
Goods & Services
Scientific laboratory apparatus and instruments; scientific research apparatus and instruments; microscopes and parts and fittings therefor; objective lenses for microscopes; industrial endoscopes and parts and fittings therefor; nondestructive testing instruments and parts and fittings therefor; X-ray fluorescence analyzers and parts and fittings therefor; microscopic software to control microscopes, capture images, and process, measure and analyze images captured by digital imaging devices for industrial use; downloadable cloud computing software; laboratory apparatus and instruments; photographic machines and apparatus; cinematographic machines and apparatus; optical machines and apparatus; measuring or testing machines and instruments; electronic microscopes and their parts; scanning electron microscopes and their parts.
Repair and maintenance of microscopes, industrial endoscopes, nondestructive testing instruments, ultrasonic flaw detectors, x-ray fluorescence analyzers; repair or maintenance of cinematographic machines and apparatus; repair or maintenance of optical machines and apparatus; repair or maintenance of photographic machines and apparatus; repair or maintenance of electronic machines and apparatus; repair or maintenance of laboratory apparatus and instruments; repair or maintenance of measuring and testing machines and instruments.
Arranging, conducting and organization of seminars on microscopes, nondestructive testing instruments, measuring apparatus; arranging, conducting and organization of seminars.
Cloud computing; electronic storage services for archiving databases; software as a service [SaaS]; testing or research on microscopes; image analyzing services; rental of computers; providing computer programs on data networks; testing, inspection or research of pharmaceuticals, cosmetics or foodstuffs; research on building construction or city planning; testing or research on prevention of pollution; testing or research on electricity; testing or research on civil engineering; testing, inspection or research on agriculture, livestock breeding or fisheries; testing or research on machines, apparatus and instruments.
Medical image analyzing services for medical diagnosis; providing medical information using microscope images; providing medical information; physical examination; dietary and nutritional guidance; animal breeding; veterinary services; beautification for animals; massage and therapeutic shiatsu massage; chiropractic; moxibustion; treatment for dislocated joints, sprain, bone fractures or the like [judo-seifuku]; bodywork therapy; acupuncture.
An observation system includes: an observation device that includes an eyepiece lens and an objective and forms a real image of a sample on an optical path between the eyepiece lens and the objective; and an observation auxiliary device that is worn by a user and outputs auxiliary information to the user, the observation auxiliary device superimposing the auxiliary information on a virtual image of the sample to be observed by the user through the eyepiece lens on the basis of a relative position of the observation auxiliary device with respect to the observation device.
An image processing system includes a scanner, a pixelated photon detector (PPD), and at least one processor. The at least one processor displays a setting screen on a display unit. The setting screen is a screen for setting an identification range that is a range of gradation to be identified. The at least one processor displays a second image obtained by converting a first image on the display unit. The second image is an image obtained by converting the first image based on at least the identification range. The first image is generated based on an intensity signal of light detected by the PPD and a scanning position by the scanner.
A device for evaluating progression of differentiation from pluripotent stem cells to pigment-containing cells includes a first irradiation unit configured to irradiate cells in a vessel with light in a melanin absorption wavelength band, a first detection unit configured to detect photo-acoustic waves generated from the cells irradiated with the light, and a processor configured to output an evaluation result relevant to the progression of the differentiation of the cells to the pigment-containing cells, based on intensity of the detected photo-acoustic waves detected by the first detection unit.
The apparatus includes a processor that selects parameter-setting candidate information by referring to a database about a parameter setting history of image analysis that analyzes a cell image, and an output unit that outputs the selected parameter-setting candidate information. The database includes setting history information, which is a collection of pieces of combination information indicating combinations of parameters set in previously performed image analysis, each piece of combination information including a recognition parameter that specifies an object of image recognition and an analysis parameter that specifies what feature of the object of image recognition is focused on in performing the image analysis. The processor, in response to selection of a first parameter as a recognition parameter, selects the parameter-setting candidate information based on combination information including the first parameter in the setting history information.
A system includes a control device including a processor. The processor is configured to record, into a log file, a log regarding information on an event that has occurred in the system, such that an operation state of the system at the point in time of the occurrence of the event can be specified as any of a plurality of operation states or an operation state different from the plurality of operation states. Here, the plurality of operation states correspond to a plurality of pieces of previously defined sequence processing. The processor is configured to extract the log from the log file, based on information designating a particular operation state as the operation state of the system, in units of operation states of the system, such that the extracted log includes information on an event having occurred in the particular operation state.
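The extraction step can be pictured as filtering log entries by the operation state recorded with them; the sketch below assumes a simple per-entry state field and hypothetical state names, which the abstract does not mandate.

log_file = [
    {"state": "SCANNING", "event": "stage moved"},     # hypothetical entries and state names
    {"state": "IDLE",     "event": "cover opened"},
    {"state": "SCANNING", "event": "frame acquired"},
]

def extract_logs(entries, designated_state):
    # Return the entries whose recorded operation state matches the designated state.
    return [entry for entry in entries if entry["state"] == designated_state]

print(extract_logs(log_file, "SCANNING"))  # the two SCANNING entries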