Human Generated Data

Title

Untitled (Dr. Herman M. Juergens, nun and nurses in hospital)

Date

1965-1968

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.478

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Art 100
Collage 100
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Person 98.1
Person 98
Person 97.9
Person 97.7
Person 96.8
Person 96.7
Adult 82.4
Person 82.4
Bride 82.4
Female 82.4
Wedding 82.4
Woman 82.4
Person 79
Architecture 78.2
Building 78.2
Hospital 78.2
Head 71.4
Aircraft 70.9
Airplane 70.9
Transportation 70.9
Vehicle 70.9
Person 61.5
Indoors 57.1
Chart 56.8
Plot 56.8
Lighting 56.8
Body Part 55.1
Face 55.1
Neck 55.1
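
The Amazon tags above follow the shape of AWS Rekognition's DetectLabels response: a label name plus a confidence score, with repeated entries such as Person typically coming from per-instance detections. A minimal sketch of how such pairs could be generated, assuming boto3 is configured with credentials and that "photo.jpg" is a local copy of the image (both are illustrative assumptions, not part of the original record):

```python
# Illustrative sketch only: regenerate label/confidence pairs in the style of
# the Amazon list above using AWS Rekognition DetectLabels.
# Assumptions: boto3 credentials are configured; "photo.jpg" is a hypothetical
# local copy of the photograph.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # roughly the lowest confidence shown above
    )

for label in response["Labels"]:
    # Top-level label confidence, e.g. "Art 100", "Hospital 78.2"
    print(f"{label['Name']} {label['Confidence']:.1f}")
    # Repeated "Person" rows above likely correspond to per-instance detections
    for instance in label.get("Instances", []):
        print(f"  {label['Name']} (instance) {instance['Confidence']:.1f}")
```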

Clarifai
created on 2018-10-06

room 97.8
people 97.2
furniture 96.4
man 94
woman 93.9
desk 93.8
chair 93.3
adult 92.9
indoors 91.7
table 90.9
inside 89.6
desktop 88.6
hospital 87.5
sit 87
monochrome 83.5
family 83.3
no person 81.7
office 81.6
seat 81
illustration 80.6
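
The Clarifai tags above (concept name plus confidence) match the output of Clarifai's predict endpoint for its general image-recognition model. A hedged sketch, assuming the v2 REST API; the API key, model ID, and image URL below are placeholders, not values from this record:

```python
# Hedged sketch: fetch concept tags similar to the Clarifai list above via the
# Clarifai v2 predict REST endpoint. The API key, model ID, and image URL are
# placeholder assumptions; check Clarifai's current documentation before use.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # placeholder
MODEL_ID = "general-image-recognition"       # assumed public general model ID
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder image location

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # e.g. "room 97.8", "people 97.2"
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```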

Imagga
created on 2018-10-06

device 24.2
equipment 20.1
kitchen appliance 19.5
machine 19.2
home appliance 19
computer 18.6
appliance 17.1
technology 17.1
office 16.9
monitor 16.8
metal 15.3
keyboard 15.2
work 14.1
room 14
business 14
instrument 12.7
silver 12.4
steel 12.4
home 12.1
kitchen 12.1
stove 11.7
espresso maker 11.5
interior 11.5
black 11.4
table 10.7
laboratory 10.6
medical 10.6
modern 10.5
screen 10.3
close 10.3
object 10.3
desk 9.8
lens 9.7
water faucet 9.5
research 9.5
light 9.4
glass 9.3
3d 9.3
coffee 9.3
coffee maker 9.2
drink 9.2
old 9.1
toaster 9.1
medicine 8.8
desktop 8.6
industry 8.5
biology 8.5
house 8.4
tool 8.3
film 8.2
stapler 8
science 8
working 8
microscope 7.9
scientific 7.7
chemistry 7.7
sink 7.6
system 7.6
furniture 7.5
electronic 7.5
laptop 7.3
cup 7.2
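
The Imagga tags above (and the Categories section further down) correspond to Imagga's tagging and categorization endpoints. A hedged sketch of the tagging call, assuming the v2 REST API with basic authentication; the key, secret, and image URL are placeholders:

```python
# Hedged sketch: request tags in the style of the Imagga list above from the
# Imagga v2 tagging endpoint. The API key/secret and image URL are placeholder
# assumptions; consult Imagga's documentation for the current interface.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # e.g. "device 24.2", "equipment 20.1"
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```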

Google
created on 2018-10-06

Microsoft
created on 2018-10-06

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 34-42
Gender Male, 98.8%
Calm 96.2%
Surprised 6.6%
Fear 5.9%
Confused 2.6%
Sad 2.2%
Happy 0.3%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 11-19
Gender Male, 94.6%
Fear 71.8%
Calm 22.7%
Happy 9.7%
Surprised 6.6%
Angry 5.3%
Sad 3.4%
Disgusted 2%
Confused 1.3%

AWS Rekognition

Age 26-36
Gender Male, 57.5%
Calm 64.9%
Sad 55.7%
Surprised 6.9%
Fear 6.1%
Confused 2.2%
Happy 1.2%
Disgusted 1%
Angry 0.5%

AWS Rekognition

Age 20-28
Gender Male, 99.8%
Sad 99.3%
Happy 20.3%
Calm 13.2%
Surprised 6.6%
Fear 6.3%
Angry 1.4%
Confused 0.6%
Disgusted 0.6%

AWS Rekognition

Age 23-33
Gender Male, 94.3%
Fear 97%
Surprised 6.8%
Calm 3.4%
Sad 2.3%
Confused 0.5%
Disgusted 0.5%
Angry 0.4%
Happy 0.4%
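
Each block above matches one entry in AWS Rekognition's DetectFaces response: an estimated age range, a gender prediction with confidence, and independent per-emotion confidence scores (which is why they do not sum to 100%). A minimal sketch under the same illustrative assumptions as before (boto3 credentials, a local "photo.jpg"):

```python
# Illustrative sketch: produce per-face age range, gender, and emotion scores
# like the blocks above with AWS Rekognition DetectFaces.
# Assumptions: boto3 credentials configured; "photo.jpg" is a hypothetical
# local copy of the photograph.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```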

Feature analysis

Amazon

Adult 98.2%
Male 98.2%
Man 98.2%
Person 98.2%
Bride 82.4%
Female 82.4%
Woman 82.4%
Airplane 70.9%

Categories

Imagga

interior objects 99.4%
food drinks 0.5%
pets animals 0.1%

Captions

Text analysis

Amazon

18
KODA
KODAK
PAN
FILM
20
19
19A
TRI
16A
18A
KODAK TRI X PAN FILM
S'AI
X
S'AI LITY
15A
-15
2.16
217
LITY
17A
MA
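
The strings above appear to be OCR of the negative's edge markings and frame numbers (e.g. KODAK TRI X PAN FILM, 15A-19A). Output of this shape comes from AWS Rekognition's DetectText API; a minimal sketch under the same illustrative assumptions (boto3 credentials, a local "photo.jpg"):

```python
# Illustrative sketch: extract detected text such as the Kodak edge markings
# listed above with AWS Rekognition DetectText.
# Assumptions: boto3 credentials configured; "photo.jpg" is a hypothetical
# local copy of the photograph.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the word-level duplicates
        print(detection["DetectedText"])
```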

Google

→ 18
18