Human Generated Data

Title

Endo Lab, Garden City

Date

1978

People

Artist: Per Brandin, American, born 1953

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Apeiron Workshops, 2.2002.399

Copyright

© Per Brandin 1979

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Clarifai
created on 2023-10-26

grinder 99.1
monochrome 99
people 98.8
industry 98.7
production 95.5
machine 94.6
stock 94.2
two 92.6
man 91.1
conveyer belt 91
adult 90.9
commerce 90.5
woman 88.2
box 88.2
room 85.4
one 84.8
shop 83.5
furniture 83.3
indoors 83.2
vehicle 82.1

Imagga
created on 2022-01-09

nurse 59.2
man 34.2
male 32.6
person 29.8
work 27.5
medical 27.3
people 26.8
room 26.3
professional 25.3
patient 25.1
home 24.7
doctor 24.4
adult 24.3
hospital 22.3
office 21
clinic 20.7
health 19.4
specialist 19.2
medicine 17.6
interior 16.8
occupation 16.5
business 16.4
computer 16
job 15.9
indoors 15.8
men 15.4
smiling 15.2
happy 15
working 15
uniform 14.8
worker 14.3
care 14
laptop 13.7
surgeon 12.6
modern 12.6
illness 12.4
case 12.3
women 11.9
house 11.7
portrait 11.6
couple 11.3
treatment 11
table 11
indoor 10.9
horizontal 10.9
team 10.7
science 10.7
bed 10.4
looking 10.4
technology 10.4
holding 9.9
businessman 9.7
research 9.5
equipment 9.5
corporate 9.4
sitting 9.4
senior 9.4
smile 9.3
student 9.1
lab 8.7
chair 8.7
laboratory 8.7
lifestyle 8.7
exam 8.6
profession 8.6
architecture 8.6
industry 8.5
coat 8.5
desk 8.4
floor 8.4
building 8.1
sick person 8
design 7.9
sick 7.7
drawing 7.7
test 7.7
two 7.6
reading 7.6
businesspeople 7.6
inside 7.4
businesswoman 7.3
to 7.1
happiness 7
together 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 66-76
Gender Female, 100%
Calm 54.1%
Disgusted 30.9%
Angry 5.2%
Confused 3.9%
Surprised 2.9%
Fear 1.6%
Sad 0.8%
Happy 0.5%

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 99.3%
Sad 0.3%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%
Happy 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Angry 81%
Calm 9.8%
Sad 5.6%
Surprised 1.1%
Confused 1%
Happy 0.7%
Fear 0.6%
Disgusted 0.3%

AWS Rekognition

Age 28-38
Gender Male, 100%
Angry 67.2%
Calm 24.5%
Happy 2.3%
Sad 2.1%
Surprised 1.7%
Confused 1%
Fear 0.7%
Disgusted 0.6%

AWS Rekognition

Age 16-22
Gender Male, 100%
Calm 78.8%
Sad 12.4%
Happy 2.6%
Angry 2.5%
Fear 1.2%
Surprised 1%
Disgusted 0.8%
Confused 0.7%

AWS Rekognition

Age 18-26
Gender Male, 100%
Calm 92.6%
Happy 4.7%
Sad 0.7%
Angry 0.7%
Surprised 0.5%
Confused 0.4%
Disgusted 0.4%
Fear 0.1%

Microsoft Cognitive Services

Age 55
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Tent 87%
Piano 59.5%

Categories

Text analysis

Amazon

Blakes
DOMGLAS
200
i
I
-
DOZ
Photograph Blakes
Blohes
4
IN
200 g
83
e
(Photog) Blohes
1998
.
Photograph 4 Blah
D
. SYTOWOO D
Photograph
Blah
SYTOWOO
(Photog)
of
Please
#2
1

Google

Phatma Blatss Gkatma Blakes DOMGLAS Phatma Blakes DOMGLASo 24A
Phatma
Blatss
Gkatma
Blakes
DOMGLAS
DOMGLASo
24A