Human Generated Data

Title

Untitled (laboratory with women in uniforms)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5789

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 96.5
Human 96.5
Person 95.7
Person 93.7
Room 93.7
Indoors 93.7
Person 88.6
Interior Design 87.7
Person 81.6
Person 80.4
Person 78.3
Workshop 76.8
Lab 76.7
Dressing Room 75.1
Furniture 70.9
Pub 59.3
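
These label and confidence pairs have the shape returned by Amazon Rekognition's DetectLabels API. A minimal sketch of how such tags could be produced with boto3; the filename and region are placeholders:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph.
with open("durette_studio_laboratory.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,  # the lowest score above ("Pub 59.3") clears this cutoff
)

# Print label/confidence pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```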

Clarifai
created on 2019-11-16

people 99.5
room 98.9
furniture 98.7
group 97.3
indoors 95.8
bar 95.4
group together 95
chair 94.5
many 94.1
man 93.8
seat 92.5
adult 91.2
restaurant 89.8
table 89
monochrome 88.8
industry 88.8
no person 86.2
vehicle 85.6
military 85.2
stock 83.9
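
Clarifai concept scores like those above can be requested from its v2 predict endpoint. A minimal sketch, assuming the public general model; the model ID, API key, and image URL are placeholders:

```python
import requests

MODEL_ID = "general-image-recognition"  # placeholder public model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Each concept carries a name and a 0-1 value; the list above scales it by 100.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```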

Imagga
created on 2019-11-16

equipment 37.8
case 34.7
interior 32.7
modern 20.3
furniture 20.2
room 19.3
electronic equipment 18.5
musical instrument 18.2
house 17.5
design 17.4
architecture 17.2
sequencer 16
electronic instrument 16
home 15.9
restaurant 15.8
light 15.4
chair 15.1
table 14.8
indoor 14.6
building 13.8
kitchen 13.7
business 13.3
apparatus 13.2
inside 12.9
device 12.6
steel 12.4
industry 11.9
lamp 11.5
urban 11.3
floor 11.1
luxury 11.1
work 11
wood 10.8
night 10.6
empty 10.3
office 10.1
metal 9.6
technology 9.6
black 9.6
contemporary 9.4
synthesizer 9
cabinet 9
new 8.9
amplifier 8.7
dining 8.6
glass 8.6
club 8.5
city 8.3
entertainment 8.3
music 8.3
server 8.2
digital 8.1
computer 8
indoors 7.9
stove 7.9
structure 7.8
wall 7.7
apartment 7.7
elegance 7.5
lights 7.4
bar 7.4
window 7.3
machine 7.3
people 7.2
stylish 7.2
hall 7.2
shop 7.1
decor 7.1
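
Imagga exposes comparable tagging through its v2 tags endpoint. A minimal sketch, assuming HTTP basic auth with an API key and secret; the credentials and image URL are placeholders:

```python
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)

# Imagga reports confidences on a 0-100 scale, matching the list above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```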

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 97.2
furniture 95.4
black and white 92.8
indoor 92.2
table 85.4
street 80
shelf 79.1
bottle 78.1
cabinetry 77.7
white 67.1
house 66.5
black 65
bar 64.7
monochrome 56.2
restaurant 55.2
store 55
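
Microsoft's tags match the output of the Azure Computer Vision Analyze Image operation with the Tags visual feature. A minimal sketch against the REST API, assuming a v2.0-era endpoint; the region, key, and filename are placeholders:

```python
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder region

with open("durette_studio_laboratory.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": "YOUR_KEY",
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)

# Azure returns 0-1 confidences; the list above scales them by 100.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```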

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-32
Gender Female, 51.7%
Angry 45%
Happy 45%
Fear 45%
Disgusted 45%
Sad 45.2%
Calm 54.6%
Surprised 45.1%
Confused 45%

AWS Rekognition

Age 22-34
Gender Female, 52.1%
Happy 45%
Angry 45.3%
Disgusted 45%
Confused 45.1%
Sad 50.5%
Calm 49%
Surprised 45%
Fear 45%

AWS Rekognition

Age 31-47
Gender Male, 51.1%
Happy 46.8%
Surprised 45.1%
Fear 45.5%
Calm 50.2%
Angry 45.6%
Sad 46.6%
Confused 45.1%
Disgusted 45.1%

AWS Rekognition

Age 22-34
Gender Male, 50.5%
Calm 50.1%
Surprised 49.7%
Disgusted 49.5%
Happy 49.5%
Angry 49.5%
Confused 49.6%
Sad 49.5%
Fear 49.5%

AWS Rekognition

Age 15-27
Gender Male, 50.3%
Surprised 49.5%
Happy 49.5%
Confused 49.5%
Sad 50.5%
Angry 49.5%
Fear 49.5%
Disgusted 49.5%
Calm 49.5%

AWS Rekognition

Age 17-29
Gender Female, 50.4%
Happy 49.5%
Disgusted 49.5%
Angry 49.5%
Fear 49.5%
Calm 50%
Surprised 49.5%
Sad 50%
Confused 49.5%

AWS Rekognition

Age 26-42
Gender Female, 52.1%
Angry 45.1%
Disgusted 45%
Happy 46.1%
Calm 45.2%
Surprised 45%
Fear 45.1%
Confused 45.1%
Sad 53.4%
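
The per-face age ranges, gender estimates, and emotion scores above follow the structure of AWS Rekognition's DetectFaces API with all facial attributes requested. A minimal sketch with boto3; the filename and region are placeholders:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("durette_studio_laboratory.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request AgeRange, Gender, Emotions, etc.
)

# One block per detected face, mirroring the records above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]   # e.g. {"Low": 20, "High": 32}
    gender = face["Gender"]  # e.g. {"Value": "Female", "Confidence": 51.7}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # one confidence score per emotion type
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```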

Feature analysis

Amazon

Person 96.5%

Categories