Human Generated Data

Title

Untitled (laboratory with women in uniforms)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5790

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Furniture 99.7
Chair 99.7
Person 98.8
Human 98.8
Person 98
Person 97.2
Indoors 95.8
Room 95.8
Person 95.8
Chair 94.9
Person 93
Person 92.4
Dining Room 82.5
Interior Design 76.5
Architecture 73.6
Tower 73.6
Clock Tower 73.6
Building 73.6
Workshop 57.8
Conference Room 57.7
Meeting Room 57.7
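
The Amazon tags above pair a label with a confidence score, which is the shape of output returned by AWS Rekognition's label detection. A minimal Python (boto3) sketch is shown below; the file name, region, and thresholds are illustrative assumptions, not details recorded here.

```python
# Minimal sketch, assuming a local copy of the photograph and an AWS account
# configured for boto3; none of these specifics come from the record above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("durette_laboratory.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # assumed cap
    MinConfidence=50,    # assumed threshold
)

# Each entry pairs a label name with a confidence score,
# e.g. "Furniture 99.7", "Chair 99.7", as listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```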

Clarifai
created on 2019-11-16

people 99.5
monochrome 99.4
group together 98.2
street 96.7
bar 96.5
group 96.3
man 95
adult 94.1
restaurant 92
train 90.1
vehicle 88.5
many 87.2
crowd 87
city 86.3
indoors 85.6
room 85.3
grinder 85.3
furniture 84.7
stock 83
industry 82.5

Imagga
created on 2019-11-16

percussion instrument 53.5
musical instrument 43.9
restaurant 41
interior 38.9
steel drum 37.7
building 28.5
drum 27.7
chair 27
room 26.6
modern 24.5
furniture 24.1
table 23.5
house 20.9
kitchen 20.4
home 19.1
structure 18.4
counter 18
indoors 17.6
architecture 17.2
glass 17.1
shop 16.1
design 15.7
light 15.4
decor 15
luxury 14.6
apartment 14.4
indoor 13.7
comfortable 13.4
equipment 13.3
floor 13
office 12.9
inside 12.9
people 12.8
window 12.8
barbershop 11.8
elegance 11.7
decoration 11.6
business 11.5
lamp 11.4
contemporary 11.3
stove 11.2
lifestyle 10.8
city 10.8
urban 10.5
wood 10
steel 9.7
mercantile establishment 9.3
black 9
oven 8.9
chairs 8.8
dining 8.6
salon 8.1
cabinet 7.9
food 7.8
work 7.8
stainless 7.7
men 7.7
elegant 7.7
wall 7.7
expensive 7.7
living 7.6
dinner 7.6
barroom 7.4
style 7.4
service 7.4
bar 7.4
seat 7.1
life 7
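
Imagga exposes a REST tagging endpoint that returns tag/confidence pairs like those listed above. The sketch below is a hedged illustration only: the endpoint path, parameters, image URL, and credentials are assumptions based on Imagga's public v2 API, not details from this record.

```python
# Minimal sketch, assuming Imagga's public v2 tagging endpoint and HTTP basic auth;
# the image URL and credentials are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/durette_laboratory.jpg"},  # hypothetical URL
    auth=(API_KEY, API_SECRET),
    timeout=30,
)

# Print the raw JSON; each result pairs a tag with a confidence score,
# as in the Imagga listing above.
print(response.json())
```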

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 93.6
floor 91.9
indoor 88.4
table 79.6
black and white 79.2
white 66
person 60.3
furniture 20.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Female, 54.8%
Angry 45.1%
Disgusted 45%
Surprised 45%
Sad 52.4%
Confused 45.1%
Calm 46.7%
Fear 45.7%
Happy 45%

AWS Rekognition

Age 25-39
Gender Female, 54.4%
Angry 45.1%
Calm 51%
Surprised 45.2%
Confused 45.2%
Disgusted 45.1%
Happy 45.1%
Sad 48.2%
Fear 45.1%

AWS Rekognition

Age 31-47
Gender Female, 50.9%
Happy 45.1%
Confused 45.4%
Calm 46%
Angry 45.2%
Disgusted 45.1%
Surprised 45%
Fear 45.2%
Sad 53.1%

AWS Rekognition

Age 13-23
Gender Female, 50.4%
Happy 49.5%
Sad 49.7%
Disgusted 49.6%
Calm 49.5%
Angry 49.6%
Surprised 49.5%
Fear 49.6%
Confused 50%

AWS Rekognition

Age 11-21
Gender Female, 50.3%
Angry 49.5%
Calm 49.5%
Surprised 49.5%
Confused 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 50.5%
Fear 49.5%

AWS Rekognition

Age 5-15
Gender Female, 50.1%
Disgusted 49.5%
Happy 49.5%
Angry 49.5%
Fear 49.5%
Sad 49.8%
Surprised 49.6%
Confused 49.7%
Calm 49.8%

AWS Rekognition

Age 6-16
Gender Male, 54.2%
Happy 45%
Angry 45%
Confused 45%
Calm 45.1%
Disgusted 45%
Fear 45%
Surprised 45%
Sad 54.9%
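
Each face block above gives an estimated age range, a gender guess with a confidence, and per-emotion confidences. That is the shape of AWS Rekognition's face detail output; a minimal boto3 sketch follows, with the file name and region as assumptions.

```python
# Minimal sketch, assuming the same hypothetical local file as above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("durette_laboratory.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]       # e.g. {"Low": 23, "High": 35}
    gender = face["Gender"]      # e.g. {"Value": "Female", "Confidence": 54.8}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # ANGRY, CALM, SAD, ...
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```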

Feature analysis

Amazon

Chair 99.7%
Person 98.8%
Clock Tower 73.6%

Categories

Text analysis

Google

NATm
NATm
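
The Google text readings above ("NATm") are raw OCR-style detections. A minimal sketch of text detection with the Google Cloud Vision client library is below; the file name is an assumption, and the repeated short string is consistent with the first annotation being the full detected text and later entries individual fragments.

```python
# Minimal sketch, assuming the google-cloud-vision client library and a
# hypothetical local copy of the photograph.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("durette_laboratory.jpg", "rb") as f:  # hypothetical file name
    content = f.read()

response = client.text_detection(image=vision.Image(content=content))

# The first annotation is the full detected text; subsequent entries are
# individual words or fragments, which is why short readings can repeat.
for annotation in response.text_annotations:
    print(annotation.description)
```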