Human Generated Data

Title

Untitled (two women with nun posing in living room)

Date

c. 1933, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5837

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 94.7
Person 94.7
Indoors 93.8
Interior Design 93.8
Person 93.1
Person 93.1
Restaurant 92.3
Room 90.1
Cafe 86.8
Furniture 74.1
Living Room 70.1
Person 65.7
Meal 60.3
Food 60.3
Monitor 59.2
Display 59.2
Screen 59.2
Electronics 59.2
Cafeteria 58.1
Bedroom 57.8
Food Court 56.4
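
The Amazon tags above are label names paired with confidence scores, the shape of output returned by the AWS Rekognition DetectLabels API. A minimal sketch of how such tags could be produced with boto3 follows; the bucket and object names are hypothetical placeholders, not the museum's actual storage.

    # Sketch: image labels with AWS Rekognition (boto3).
    # Bucket/key names are hypothetical placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-photos", "Name": "durette-5837.jpg"}},
        MaxLabels=25,
        MinConfidence=55,
    )

    # Each label carries a name and a confidence percentage,
    # matching the "Label  score" pairs listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')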

Clarifai
created on 2019-11-16

people 99.8
group 97.9
adult 97.7
group together 97.5
furniture 96.6
room 96.3
man 95.8
woman 94.9
music 93.9
vehicle 91
monochrome 91
two 90.7
indoors 90
one 89.4
wear 88.1
sit 87.1
street 86.6
chair 85.9
several 82.7
many 82.2

Imagga
created on 2019-11-16

salon 50.7
shop 39.7
barbershop 37.1
mercantile establishment 26.2
black 22.2
hairdresser 21.6
restaurant 17.7
place of business 17.5
window 16.8
man 16.1
building 15.5
interior 15
people 13.4
light 12.7
style 11.9
fashion 11.3
old 11.1
inside 11
adult 11
glass 10.9
dark 10.8
person 10.7
room 10.5
urban 10.5
wall 10.3
street 10.1
male 9.9
structure 9.8
equipment 9.8
elegance 9.2
house 9.2
modern 9.1
retro 9
design 9
indoors 8.8
establishment 8.7
architecture 8.6
vintage 8.4
sexy 8
art 7.8
ancient 7.8
chair 7.6
city 7.5
leisure 7.5
one 7.5
shadow 7.2
love 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99
indoor 98
person 93.3
black and white 90.7
living 89
room 84.4
clothing 83
music 72.4
table 71.6
man 51.6
furniture 24.9
cluttered 13.2
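
The Microsoft tags are consistent with the Tags feature of the Azure Computer Vision Analyze Image REST API (v2.0 was current in 2019). A hedged sketch of such a request follows; the endpoint host, subscription key, and image URL are placeholders.

    # Sketch: image tags from Azure Computer Vision (Analyze Image, v2.0).
    # Endpoint, key, and image URL are hypothetical placeholders.
    import requests

    endpoint = "https://example.cognitiveservices.azure.com"
    key = "YOUR_SUBSCRIPTION_KEY"

    response = requests.post(
        f"{endpoint}/vision/v2.0/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
        json={"url": "https://example.org/durette-5837.jpg"},
    )
    response.raise_for_status()

    # Azure reports confidence in the 0-1 range; scale to percentages
    # to match the scores shown above.
    for tag in response.json().get("tags", []):
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')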

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 19-31
Gender Female, 50.3%
Surprised 45.3%
Disgusted 45.3%
Sad 47.8%
Angry 45.3%
Calm 49.6%
Happy 45.2%
Fear 46%
Confused 45.6%

AWS Rekognition

Age 13-25
Gender Female, 53.8%
Confused 45.9%
Calm 45.4%
Sad 46.6%
Surprised 45.1%
Happy 46.6%
Disgusted 45.2%
Fear 45.4%
Angry 49.8%

AWS Rekognition

Age 16-28
Gender Male, 53.7%
Angry 53.7%
Fear 45.1%
Disgusted 45.2%
Sad 45.3%
Confused 45.2%
Calm 45.3%
Happy 45%
Surprised 45.1%

AWS Rekognition

Age 37-55
Gender Male, 50.5%
Disgusted 49.5%
Sad 49.5%
Happy 49.5%
Surprised 49.6%
Calm 50%
Fear 49.5%
Confused 49.5%
Angry 49.8%

AWS Rekognition

Age 22-34
Gender Male, 50.5%
Angry 46.4%
Happy 45.1%
Disgusted 45.1%
Calm 52.4%
Fear 45.2%
Surprised 45.3%
Sad 45.5%
Confused 45.1%
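
The five face records above follow the shape of an AWS Rekognition DetectFaces response: an estimated age range, a gender estimate with confidence, and a confidence score for each emotion type. A minimal sketch with boto3, again using placeholder S3 names:

    # Sketch: per-face attributes with AWS Rekognition (boto3).
    # Bucket/key names are hypothetical placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-photos", "Name": "durette-5837.jpg"}},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')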

Feature analysis

Amazon

Person 94.7%

Text analysis

Google

C.
C.