Human Generated Data

Title

Untitled (portrait of group in living room wearing costumes)

Date

c. 1930

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4268

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.4
Human 99.4
Person 98.9
Person 98.8
Person 97.9
Person 96.6
Person 96.3
Person 95
Person 93
Person 92.6
Person 89.2
Person 88.5
Leisure Activities 85
Person 77.3
Musical Instrument 73.1
Guitar 73.1
People 72.1
Apparel 68.3
Clothing 68.3
Person 63.8
Musician 58.8
Helmet 57.3
Crowd 56.2
Female 55.2
Girl 55.2

Clarifai
created on 2019-06-01

illustration 97.5
people 96.5
network 95.6
technology 94.3
desktop 93.7
spherical 93.5
internet 93.1
man 92.9
computer 92.4
communication 92.2
graphic 92.2
image 91.9
group 91.6
child 89.4
ball-shaped 88.3
woman 87.8
business 87.7
touch 87.5
World Wide Web 87.5
technical school 86

Imagga
created on 2019-06-01

negative 27.1
silhouette 24.8
film 23.9
drawing 22.3
people 22.3
art 21.7
design 18
black 17.4
photographic paper 17
sketch 16.4
kin 16.1
grunge 14.5
paint 13.6
graphic 13.1
team 12.5
pattern 12.3
symbol 12.1
man 12.1
photographic equipment 11.3
group 11.3
women 11.1
decoration 10.9
gymnasium 10.9
mother 10.5
shape 10.4
men 10.3
reflection 10.3
person 10.3
element 9.9
representation 9.9
cartoon 9.8
human 9.7
world 9.7
ink 9.6
light 9.3
clip art 9.3
house 9.2
vintage 9.1
retro 9
game 8.9
style 8.9
figure 8.6
frame 8.5
male 8.5
modern 8.4
health 8.3
brother 8.3
sport 8.2
dirty 8.1
painting 8.1
athletic facility 8
life 7.9
business 7.9
urban 7.9
happiness 7.8
boy 7.8
architecture 7.8
play 7.8
party 7.7
elegance 7.5
fun 7.5
color 7.2
star 7.2
family 7.1
love 7.1
working 7.1
work 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

text 98
window 96.6
posing 95.6
clothing 90.5
person 90.1
old 89.5
group 79.5
human face 65.8
smile 64.3
vintage 25.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-55
Gender Male, 53.4%
Happy 45.3%
Surprised 45.6%
Angry 45.4%
Confused 45.6%
Calm 51.8%
Sad 46%
Disgusted 45.2%

AWS Rekognition

Age 26-43
Gender Male, 54.3%
Happy 52.8%
Angry 45.7%
Surprised 45.3%
Disgusted 45.1%
Sad 45.7%
Calm 45.3%
Confused 45.2%

AWS Rekognition

Age 19-36
Gender Female, 51.6%
Confused 45.3%
Disgusted 45.2%
Calm 51.8%
Angry 45.2%
Sad 46.8%
Happy 45.3%
Surprised 45.3%

AWS Rekognition

Age 26-43
Gender Female, 50.8%
Happy 51.1%
Surprised 45.6%
Calm 45.4%
Sad 46.8%
Disgusted 45.2%
Angry 45.4%
Confused 45.4%

AWS Rekognition

Age 16-27
Gender Female, 50.8%
Sad 45.6%
Happy 45.7%
Confused 45.4%
Angry 45.4%
Surprised 45.4%
Disgusted 45.3%
Calm 52.3%

AWS Rekognition

Age 20-38
Gender Female, 50.1%
Disgusted 45.5%
Happy 46.7%
Sad 46.8%
Calm 49.5%
Angry 45.5%
Surprised 45.5%
Confused 45.5%

AWS Rekognition

Age 27-44
Gender Male, 53.5%
Confused 45.2%
Surprised 45.4%
Happy 48.6%
Angry 45.8%
Sad 45.4%
Disgusted 46.2%
Calm 48.4%

Feature analysis

Amazon

Person 99.4%
Helmet 57.3%

Categories

Imagga

paintings art 99.5%