Human Generated Data

Title

Untitled (three photographs: couple in doorway; at table; on porch)

Date

c. 1945, printed later

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6788

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.8
Person 99.8
Person 99.6
Person 99.5
Person 99.4
Person 99
Advertisement 98.5
Collage 98.5
Poster 98.5
Apparel 98.2
Clothing 98.2
Person 91.4
Sleeve 85.7
Home Decor 77.8
Long Sleeve 63.5
Door 59.9
Shirt 58.7
Text 58.5
Label 58.5
Floor 58.4

Clarifai
created on 2019-11-16

people 99.9
group 99
adult 98.5
group together 97.4
man 96.7
woman 95.9
administration 94.8
several 94.8
four 92.2
three 91.9
wear 91.7
offense 90.6
leader 90.4
two 90.2
many 89.9
one 88.8
war 88.7
military 84.4
room 84.1
home 83.4

Imagga
created on 2019-11-16

sketch 87
drawing 66.2
representation 50.9
architecture 35.9
building 34.1
window 31.5
city 25.8
house 25.1
wall 24.8
old 22.3
urban 21
street 18.4
structure 17.7
travel 16.9
home 15.1
balcony 14.9
glass 14
town 13.9
door 13.5
windows 13.4
ancient 13
construction 12.8
lamp 12.8
tourism 11.5
business 11.5
architectural 11.5
sky 11.5
roof 11.4
stone 11
road 10.8
interior 10.6
shop 10.6
black 10.2
design 10.1
barbershop 10.1
light 10
empty 9.4
brick 9.4
historical 9.4
room 9.4
indoor 9.1
silhouette 9.1
history 8.9
office 8.9
people 8.9
high 8.7
concrete 8.6
modern 8.4
floor 8.4
exterior 8.3
inside 8.3
historic 8.2
dirty 8.1
wood 7.5
style 7.4
tower 7.2
newspaper 7.1
wooden 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 96.6
text 94.3
person 92
man 88
indoor 86.6
woman 76.5
white 64.3
old 47.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 17-29
Gender Male, 54.9%
Fear 45%
Angry 45.1%
Calm 53.6%
Surprised 45.1%
Happy 45.8%
Confused 45.2%
Sad 45.3%
Disgusted 45%

AWS Rekognition

Age 13-25
Gender Male, 54.9%
Surprised 45%
Fear 45%
Happy 55%
Sad 45%
Calm 45%
Disgusted 45%
Angry 45%
Confused 45%

AWS Rekognition

Age 27-43
Gender Male, 52.2%
Disgusted 45.1%
Sad 46.9%
Fear 45.1%
Angry 46.5%
Confused 45.3%
Happy 45.5%
Calm 50.4%
Surprised 45.2%

AWS Rekognition

Age 22-34
Gender Female, 50.2%
Surprised 49.5%
Disgusted 49.7%
Happy 49.7%
Sad 49.5%
Calm 49.7%
Confused 49.6%
Fear 49.5%
Angry 49.7%

AWS Rekognition

Age 5-15
Gender Female, 52.3%
Angry 46.8%
Sad 48.6%
Confused 45.3%
Surprised 45.2%
Happy 45.6%
Disgusted 45.1%
Calm 47.6%
Fear 45.9%

Feature analysis

Amazon

Person 99.8%