Human Generated Data

Title

Untitled (three ladies sitting on folding chairs)

Date

1950

People

Artist: Samuel Cooper, American active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19509

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Apparel 99.9
Clothing 99.9
Human 99
Person 99
Person 98.8
Dress 97.8
Chair 97.4
Furniture 97.4
Person 96.4
Robe 95.2
Fashion 95.2
Wedding 94.7
Gown 94.1
Female 93.1
Leisure Activities 90.6
Dance Pose 90.6
Face 88.5
Bridegroom 88.5
Bride 86.7
Wedding Gown 86.7
Woman 79.5
Blossom 71
Flower 71
Plant 71
Indoors 68.8
Dance 68.3
Portrait 67.6
Photo 67.6
Photography 67.6
Girl 64.5
Coat 64
Suit 64
Overcoat 64
Performer 59.8
Table 59.1
Room 55.1

Clarifai
created on 2019-10-29

people 99.8
group 98.9
group together 98.6
man 97.3
adult 97.2
several 97.1
woman 96.1
four 95.6
wear 95.1
music 93.8
furniture 92.7
outfit 91.6
three 91
five 89.4
musician 85.4
veil 85.2
sit 84.1
two 82.8
facial expression 82.7
actress 81.5

Imagga
created on 2019-10-29

man 29.6
person 28.2
people 27.3
home 24.7
adult 24.5
male 20.8
room 18.6
couple 16.5
men 16.3
women 15.8
happy 15.7
portrait 15.5
happiness 14.9
teacher 14.6
dress 14.4
smile 14.2
professional 13.7
smiling 13
lifestyle 13
wedding 12.9
two 12.7
family 12.4
indoors 12.3
indoor 11.9
house 11.7
bride 11.6
holding 11.5
interior 11.5
nurse 11.2
love 11
worker 11
businessman 10.6
married 10.5
life 10.5
cheerful 9.7
business 9.7
educator 9.6
celebration 9.6
husband 9.5
domestic 9.5
chair 9.4
casual 9.3
traditional 9.1
leisure 9.1
attractive 9.1
handsome 8.9
new 8.9
color 8.9
work 8.7
groom 8.7
holiday 8.6
sitting 8.6
wife 8.5
performer 8.5
mother 8.5
black 8.4
child 8.3
playing 8.2
girls 8.2
guitar 8.2
suit 8.1
group 8.1
sexy 8
together 7.9
brunette 7.8
bouquet 7.7
party 7.7
pretty 7.7
youth 7.7
loving 7.6
elegance 7.6
fashion 7.5
musician 7.5
fun 7.5
office 7.4
active 7.4
20s 7.3
decoration 7.3
clothing 7.3
music 7.3
team 7.2
modern 7

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

text 99.5
wedding dress 88.2
person 85.2
clothing 83
dress 82.5
woman 78.2
bride 75.6
wedding 58.3
furniture 55.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 44-62
Gender Male, 95.3%
Calm 96.4%
Fear 0.2%
Angry 0.1%
Happy 0.1%
Confused 0.1%
Disgusted 0%
Sad 2.9%
Surprised 0.1%

AWS Rekognition

Age 38-56
Gender Female, 51.7%
Sad 50.2%
Surprised 45%
Happy 45%
Confused 45.1%
Fear 45%
Disgusted 45%
Calm 47.6%
Angry 47.1%

AWS Rekognition

Age 47-65
Gender Female, 64.2%
Surprised 0.3%
Confused 0.1%
Fear 0%
Disgusted 0.1%
Sad 0.4%
Happy 0.3%
Calm 98.6%
Angry 0.1%

AWS Rekognition

Age 29-45
Gender Female, 54.8%
Angry 45.2%
Happy 45.8%
Sad 46.7%
Fear 45.1%
Calm 51.7%
Confused 45.2%
Surprised 45.2%
Disgusted 45%

Feature analysis

Amazon

Person 99%

Categories

Text analysis

Amazon

2
92
SAPETY 2 500A 92
't
500A
SAPETY

Google

KODAR SAFETY 2
KODAR
SAFETY
2