Human Generated Data

Title

Untitled (group of men sitting and standing on hill with device on tripod in center)

Date

c. 1907

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3860

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 98.4
Human 98.4
Person 97.1
Person 96.8
Person 96.7
Face 95.3
Person 94.7
Outdoors 92.9
Nature 92.1
People 90.9
Person 89.9
Apparel 88.9
Clothing 88.9
Person 84.8
Person 80.3
Person 80
Person 78.1
Sand 76.1
Wedding 69.6
Land 68.2
Forest 68.2
Vegetation 68.2
Tree 68.2
Plant 68.2
Woodland 68.2
Gown 68.2
Fashion 68.2
Robe 65.8
Grass 65.4
Photo 64.6
Photography 64.6
Portrait 64.6
Crowd 63.7
Female 62.2
Dress 61.2
Overcoat 59.1
Coat 59.1
Suit 59.1
Wedding Gown 57.6
Meal 57.4
Picnic 57.4
Leisure Activities 57.4
Vacation 57.4
Food 57.4
Smile 57.1
Bridegroom 56.1

Clarifai
created on 2019-06-01

people 98.4
snow 95.7
winter 93.8
man 92.5
adult 91.3
woman 89.2
veil 87.7
vintage 86.3
wedding 83.5
desktop 82
old 80.9
wear 80.7
child 80.6
cold 79.6
retro 77.8
antique 77.2
ice 75.8
sepia 75.7
monochrome 75.1
group 72

Imagga
created on 2019-06-01

negative 100
film 100
photographic paper 77.8
photographic equipment 51.9
grunge 31.5
sketch 26.7
texture 25
drawing 23
old 20.9
aged 20.8
pattern 19.1
vintage 19
antique 18.2
decoration 17.5
art 16.3
cool 16
snow 15.6
water 15.3
representation 15.3
cold 14.6
wallpaper 14.5
material 14.3
textured 14
detail 13.7
floral 13.6
frame 13.3
space 13.2
design 12.9
color 12.8
structure 12.7
paint 12.7
dirty 12.6
aging 12.5
paper 12.1
effect 11.9
ice 11.7
flower 11.5
glass 10.9
dress 10.8
retro 10.6
frost 10.6
backgrounds 10.5
clear 10.5
grungy 10.4
wall 10.3
winter 10.2
grain 10.1
surface 9.7
obsolete 9.6
frozen 9.6
ancient 9.5
canvas 9.5
natural 9.4
horizontal 9.2
decorative 9.2
border 9
graffito 8.8
crystal 8.8
fracture 8.8
graphic 8.8
weather 8.7
text 8.7
crack 8.7
artistic 8.7
decay 8.7
detailed 8.7
stain 8.6
bride 8.6
close 8.6
nobody 8.6
element 8.3
outdoors 8.2
style 8.2
romantic 8
bright 7.9
flowers 7.8
grime 7.8
black 7.8
mottled 7.8
season 7.8
ornament 7.8
edge 7.7
backdrop 7.4
rough 7.3
fantasy 7.2
celebration 7.2
transparent 7.2
leaf 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

posing 94.5
old 88.4
black and white 77.1
wedding dress 74.1
snow 67.9
wedding 52.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-44
Gender Male, 51.2%
Angry 45.3%
Happy 45.4%
Calm 49.9%
Surprised 46.7%
Sad 46.2%
Confused 46.3%
Disgusted 45.2%

AWS Rekognition

Age 29-45
Gender Female, 52.4%
Surprised 45.7%
Happy 46.4%
Disgusted 45.9%
Calm 47.5%
Sad 47.9%
Confused 45.5%
Angry 46.1%

AWS Rekognition

Age 48-68
Gender Male, 51.4%
Disgusted 45.6%
Calm 47.9%
Angry 47.1%
Sad 47.4%
Happy 45.8%
Surprised 45.7%
Confused 45.6%

AWS Rekognition

Age 14-23
Gender Male, 51.1%
Sad 45.9%
Angry 46.1%
Calm 50.7%
Confused 45.7%
Disgusted 45.2%
Happy 46.1%
Surprised 45.3%

AWS Rekognition

Age 20-38
Gender Female, 54.8%
Angry 46%
Disgusted 46.3%
Confused 46.3%
Calm 46.2%
Sad 47.6%
Surprised 46%
Happy 46.5%

Feature analysis

Amazon

Person 98.4%

Categories

Imagga

paintings art 100%