Human Generated Data

Title

Untitled (two infants lying in crib)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6001

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 98.1
Human 98.1
Person 96.8
Footwear 88.1
Apparel 88.1
Shoe 88.1
Clothing 88.1
Coat 73.8
Overcoat 73.8
Suit 73.8
Sleeve 72.7
Shoe 58.4
Person 56.6
Shoe 55.4

Clarifai
created on 2019-11-16

people 99.9
adult 98.5
man 96.2
portrait 96.1
wear 95.9
outfit 95
administration 94.8
vehicle 94.3
uniform 94.3
military 93.1
group 92.7
movie 92.4
war 91.7
child 90.4
two 90.1
actor 90
offense 89.4
leader 89.2
one 87.7
police 87.7

Imagga
created on 2019-11-16

television 41.5
telecommunication system 28.1
man 22.8
people 17.8
male 17.7
black 16.4
person 15.4
old 13.2
art 13.1
vintage 11.6
adult 11
one 10.4
antique 10.4
business 10.3
dark 10
retro 9.8
window 9.8
portrait 9.7
monitor 9.1
office 8.9
businessman 8.8
wall 8.5
room 8.5
grunge 8.5
frame 8.3
fashion 8.3
aged 8.1
dirty 8.1
symbol 8.1
design 8
light 8
film 7.8
ancient 7.8
human 7.5
suit 7.5
blackboard 7.4
protection 7.3
screen 7.3
mask 7.2
face 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.3
clothing 93.6
person 83.7
man 71.1
coat 62.8
black and white 55.6
posing 44.4
old 40.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-43
Gender Male, 54.4%
Angry 47.1%
Happy 45%
Disgusted 45.1%
Confused 50.1%
Surprised 45.1%
Calm 46.9%
Fear 45%
Sad 45.6%

AWS Rekognition

Age 33-49
Gender Male, 53%
Angry 49.6%
Sad 45.5%
Confused 45.1%
Calm 49.7%
Surprised 45.1%
Happy 45%
Disgusted 45%
Fear 45.1%

Feature analysis

Amazon

Person 98.1%
Shoe 88.1%
Suit 73.8%

Categories