Human Generated Data

Title

Untitled (five women on road)

Date

1944, copied later

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19503

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Apparel 99.7
Clothing 99.7
Human 98.4
Person 98.4
Person 97.7
Person 97.4
Person 96.8
Person 94.6
Dress 92.3
Nature 91
Female 89.6
Outdoors 88.7
Shorts 88.5
Woman 77.8
People 70.5
Skirt 70.5
Sea 69
Ocean 69
Water 69
Sand 64.6
Hat 63.9
Portrait 62.9
Photography 62.9
Face 62.9
Photo 62.9
Coast 57.6
Beach 57.6
Shoreline 57.6
Costume 55.5

Clarifai
created on 2019-10-29

people 99.9
group 98.7
group together 98
many 97.8
adult 97.4
wear 96.1
child 95.9
man 95.3
woman 94.4
outfit 90.7
vehicle 90.6
several 85.3
leader 84
boy 84
offspring 83.7
administration 81.3
two 79.8
veil 79.6
transportation system 79.1
crowd 77.9

Imagga
created on 2019-10-29

picket fence 33.2
fence 27
silhouette 22.3
people 20.6
barrier 20
man 18.2
snow 17.5
water 16
sky 15.3
landscape 14.9
travel 14.8
male 14.2
beach 13.8
obstruction 13.4
weather 13.4
black 12.6
business 12.1
person 11.7
sunset 11.7
tourism 11.5
ocean 10.9
team 10.7
structure 10.7
mountain 10.7
crowd 10.6
sun 10.5
walking 10.4
summer 10.3
sea 10.2
world 9.8
group 9.7
art 9.6
couple 9.6
scene 9.5
outdoor 9.2
park 9.1
design 9
vacation 9
white 8.9
negative 8.8
businessman 8.8
ice 8.6
holiday 8.6
winter 8.5
teamwork 8.3
city 8.3
building 8.2
occupation 8.2
sport 7.8
cold 7.7
dawn 7.7
men 7.7
sand 7.7
dusk 7.6
leisure 7.5
future 7.4
waves 7.4
adult 7.3
active 7.2
activity 7.2
women 7.1
cool 7.1
window 7.1
job 7.1
working 7.1
work 7.1
together 7

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

text 98.2
dress 96.1
clothing 95.2
person 91.7
woman 89.6
posing 82.8
image 35.9
picture frame 6.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 51-69
Gender Male, 53.4%
Calm 49%
Angry 46.8%
Confused 45.8%
Surprised 46.2%
Disgusted 45.1%
Happy 45.4%
Fear 45.3%
Sad 46.4%

AWS Rekognition

Age 45-63
Gender Male, 53.6%
Angry 45%
Confused 45%
Happy 45.1%
Disgusted 45%
Sad 45.3%
Fear 45%
Surprised 45.2%
Calm 54.3%

AWS Rekognition

Age 20-32
Gender Female, 51.9%
Angry 45%
Surprised 45%
Fear 45.1%
Calm 45.6%
Happy 45.1%
Disgusted 45%
Sad 54.1%
Confused 45%

AWS Rekognition

Age 25-39
Gender Male, 54.2%
Calm 45%
Disgusted 45%
Fear 45%
Sad 45%
Angry 45%
Confused 55%
Surprised 45%
Happy 45%

AWS Rekognition

Age 22-34
Gender Female, 53.1%
Confused 45.2%
Angry 45.5%
Fear 45.8%
Sad 52.9%
Disgusted 45.1%
Happy 45.3%
Calm 45.2%
Surprised 45%

Feature analysis

Amazon

Person 98.4%

Categories