Human Generated Data

Title

Untitled (copy of daguerreotype of man and woman with young boy)

Date

c. 1930

People

Artist: Curtis Studio, American active 1891 - 1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13094
Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clothing 99.3
Apparel 99.3
Person 98.5
Human 98.5
Person 97.6
Person 96.3
Advertisement 85.5
Poster 85.5
Accessories 78.1
Tie 78.1
Accessory 78.1
Shirt 77
Scientist 76.2
Face 76
Astronaut 66
Helmet 65.3
Coat 61.7
Door 61
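The Amazon tags above are (label, confidence) pairs of the kind returned by automated image-tagging services such as AWS Rekognition. As an illustration only (the `confident_labels` helper below is hypothetical, not part of this record or of any vendor API), a minimal sketch of filtering such pairs by a confidence threshold:

```python
# Illustrative only: filter (label, confidence) pairs, like the
# machine-generated tags above, by a minimum confidence threshold.
# The sample data is copied from the Amazon section of this record.
TAGS = [
    ("Clothing", 99.3), ("Apparel", 99.3), ("Person", 98.5),
    ("Human", 98.5), ("Advertisement", 85.5), ("Poster", 85.5),
    ("Tie", 78.1), ("Shirt", 77.0), ("Scientist", 76.2),
    ("Face", 76.0), ("Astronaut", 66.0), ("Helmet", 65.3),
    ("Coat", 61.7), ("Door", 61.0),
]

def confident_labels(tags, threshold=90.0):
    """Return labels whose confidence meets or exceeds the threshold."""
    return [label for label, conf in tags if conf >= threshold]

# At a 90% threshold, only the strongest tags survive.
print(confident_labels(TAGS))
```

A higher threshold trims speculative tags such as "Astronaut" (66) and "Helmet" (65.3), which are likely misreadings of the sitters' formal dress in this copy of a daguerreotype.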

Imagga
created on 2022-01-29

lab coat 54
coat 48.3
person 31.6
man 30.9
male 29.8
nurse 26.8
adult 24.7
people 24.5
doctor 22.5
portrait 21.3
garment 21.3
stethoscope 20.5
medical 20.3
clothing 20.3
health 18.7
professional 17.9
smile 17.1
hospital 16.9
happy 16.3
medicine 15.8
handsome 15.1
black 15
attractive 14.7
looking 14.4
one 14.2
standing 13.9
men 13.7
uniform 13.5
care 13.2
hair 12.7
guy 12.4
smiling 12.3
physician 11.7
businessman 11.5
face 11.4
friendly 11
business 10.9
posing 10.7
clinic 10.5
old 10.4
occupation 10.1
confident 10
job 9.7
lab 9.7
success 9.6
healthy 9.4
lifestyle 9.4
senior 9.4
casual 9.3
mature 9.3
hand 9.1
pose 9.1
fashion 9
human 9
office 8.8
shirt 8.8
body 8.8
health care 8.7
laboratory 8.7
work 8.6
staff 8.6
expression 8.5
jacket 8.4
student 8.1
dress 8.1
practitioner 7.9
happiness 7.8
antique 7.8
model 7.8
grunge 7.7
serious 7.6
vintage 7.6
covering 7.4
suit 7.4
instrument 7.3
sexy 7.2
art 7.2
science 7.1
surgeon 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.9
wall 95.7
drawing 94.7
sketch 93.9
posing 87.6
clothing 84.9
person 81.5
human face 81.2
old 75.3
smile 59.1
man 52.2

Face analysis

Amazon

Google

AWS Rekognition

Age 34-42
Gender Male, 80.3%
Calm 96.5%
Confused 2.2%
Surprised 0.7%
Sad 0.2%
Angry 0.1%
Fear 0.1%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 13-21
Gender Male, 99.8%
Calm 96.5%
Surprised 3.5%
Fear 0%
Disgusted 0%
Sad 0%
Angry 0%
Confused 0%
Happy 0%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Calm 92.8%
Surprised 6.8%
Disgusted 0.2%
Angry 0.1%
Confused 0.1%
Happy 0.1%
Fear 0.1%
Sad 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Poster 85.5%
Tie 78.1%
Helmet 65.3%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 89.1%
a vintage photo of a group of people posing for a picture 89%
a vintage photo of a group of men posing for a picture 88.6%

Text analysis

Amazon

yeap
and