Human Generated Data

Title

Untitled (view looking up at seven women and men sitting on rock outcrop)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3688

Machine Generated Data

Tags (label, confidence in %)

Amazon
created on 2019-06-01

Human 98.5
Person 98.5
Person 98.4
Person 98.2
Person 94.8
Person 93.8
Clothing 93.5
Helmet 93.5
Apparel 93.5
Person 89.7
Art 87.5
Drawing 87.5
Face 83.4
Nature 82.9
People 79.6
Outdoors 75.7
Photo 63.4
Photography 63.4
Portrait 63.4
Person 60.5
Urban 59.6
Crowd 55.8
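
The Amazon tags above are label/confidence pairs of the kind AWS Rekognition's DetectLabels operation returns. A minimal sketch of such a call in Python with boto3 follows; the S3 bucket, object key, and thresholds are hypothetical placeholders, not the pipeline actually used to produce these tags.

```python
# Minimal sketch: fetching image labels with AWS Rekognition via boto3.
# The bucket and key are hypothetical placeholders; the actual tagging
# pipeline behind the list above is not documented here.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MaxLabels=25,
    MinConfidence=50.0,
)

# Each label carries a name and a confidence score in percent,
# matching the "label score" pairs listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```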

Clarifai
created on 2019-06-01

people 99.8
adult 99
man 98.4
group 97.6
group together 93
action 92.8
wear 92.7
woman 92
print 90.7
vehicle 90.4
art 90.3
illustration 86.8
two 85.4
recreation 84.5
child 82.9
motion 81.9
interaction 77.8
skirmish 77.2
monochrome 77.2
war 76.8
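
The Clarifai tags follow the same pattern: concept names scored by the service's general model. A minimal sketch against the Clarifai v2 REST API, assuming a hypothetical API key and image URL; the model ID shown is the documented public ID of Clarifai's general model.

```python
# Minimal sketch: requesting general-model concepts from the Clarifai
# v2 REST API. The API key and image URL are hypothetical placeholders.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # hypothetical credential
MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # public general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts come back with a name and a 0-1 value; scaling by 100
# yields percentages like those in the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```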

Imagga
created on 2019-06-01

chemical weapon 43.1
weapon of mass destruction 37.2
weapon 34.4
groom 26
fountain 23.6
summer 23.2
sky 23
instrument 21.6
water 21.4
man 20.8
outdoor 20.7
people 20.1
person 18.1
sea 15.6
landscape 15.6
sun 14.5
sunset 14.4
adult 14.3
beach 14.3
happiness 14.1
device 13.8
ocean 13.6
park 13.2
outdoors 13
natural 12.7
travel 12.7
structure 12.3
smoke 12.1
danger 11.8
power 11.8
sunlight 11.6
vacation 11.5
scenic 11.4
snow 11.4
sport 11.2
heat 11.1
grass 11.1
clouds 11
male 10.6
weather 10.5
adventure 10.4
rock 10.4
two 10.2
hot 10.1
city 10
river 9.8
sand 9.7
light 9.6
bikini 9.6
love 9.5
joy 9.2
silhouette 9.1
active 9
coast 9
cool 8.9
sexy 8.8
happy 8.8
women 8.7
standing 8.7
lifestyle 8.7
day 8.6
cloud 8.6
holiday 8.6
season 8.6
field 8.4
leisure 8.3
fun 8.2
freedom 8.2
one 8.2
activity 8.1
mountain 8
trees 8
couple 7.8
men 7.7
bride 7.7
winter 7.7
wilderness 7.6
serene 7.5
enjoy 7.5
sunrise 7.5
environment 7.4
flow 7.4
clothing 7.4
building 7.2
dirty 7.2
dress 7.2
black 7.2
recreation 7.2
wet 7.2
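
Imagga's tags likewise pair a concept with a confidence reported directly in percent. A minimal sketch against Imagga's v2 /tags endpoint, which authenticates with HTTP Basic auth; the key, secret, and image URL are hypothetical placeholders.

```python
# Minimal sketch: tagging an image with the Imagga v2 /tags endpoint.
# Key, secret, and image URL are hypothetical placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
)
resp.raise_for_status()

# Imagga reports confidence directly in percent, as in the list above.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```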

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

text 97
person 82.9
waterfall 77.8
drawing 64.7
painting 62.5
old 60.2
sketch 55.5
posing 43.9
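
The Microsoft tags match what the Azure Computer Vision analyze endpoint returns when asked for the Tags visual feature. A minimal sketch, assuming a hypothetical regional endpoint and subscription key; v2.0 was the current API version when these tags were created in 2019.

```python
# Minimal sketch: requesting tags from the Azure Computer Vision
# "analyze" endpoint (v2.0). Region and key are hypothetical placeholders.
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze"

resp = requests.post(
    ENDPOINT,
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

# Tags carry a name and a 0-1 confidence; scaled to percent here to
# match the list above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```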

Face analysis

Amazon

AWS Rekognition

Age 30-47
Gender Male, 54.4%
Happy 45.7%
Disgusted 45.8%
Angry 45.5%
Surprised 45.5%
Sad 45.9%
Calm 51.1%
Confused 45.6%

AWS Rekognition

Age 26-43
Gender Male, 51.1%
Happy 46%
Disgusted 45.2%
Angry 45.5%
Surprised 45.4%
Sad 46.7%
Calm 50.7%
Confused 45.5%

AWS Rekognition

Age 35-52
Gender Female, 53.8%
Angry 46.1%
Sad 45.6%
Happy 47.5%
Calm 45.7%
Confused 45.4%
Disgusted 49.2%
Surprised 45.5%

AWS Rekognition

Age 26-43
Gender Female, 50.8%
Angry 45.3%
Sad 45.8%
Disgusted 45.2%
Surprised 45.2%
Happy 45.3%
Calm 53.1%
Confused 45.2%

AWS Rekognition

Age 35-52
Gender Female, 53.1%
Disgusted 45.2%
Sad 50.8%
Happy 45.2%
Surprised 45.5%
Calm 47.5%
Angry 45.6%
Confused 45.3%

AWS Rekognition

Age 20-38
Gender Male, 53.1%
Angry 48.5%
Calm 45.6%
Sad 45.4%
Surprised 45.3%
Disgusted 48.6%
Happy 45.7%
Confused 45.9%
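
The per-face age ranges, gender estimates, and emotion scores above have the shape of output that AWS Rekognition's DetectFaces operation produces when all facial attributes are requested. A minimal sketch, again with hypothetical S3 placeholders:

```python
# Minimal sketch: per-face attribute estimates with Rekognition's
# DetectFaces. Bucket and key are hypothetical placeholders.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]                      # e.g. {"Low": 30, "High": 47}
    gender = face["Gender"]                     # value plus confidence in percent
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:            # one confidence per emotion type
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```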

Feature analysis

Amazon

Person 98.5%
Helmet 93.5%

Categories

Imagga

paintings art 96.5%
nature landscape 1.1%