Human Generated Data

Title

Untitled (group of men sitting and standing on hill with device on tripod in center)

Date

c. 1905

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3871

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 98.2
Human 98.2
Person 97.4
Person 94.2
Nature 94.2
Person 92.9
Person 92.9
Outdoors 92.6
Clothing 91.8
Apparel 91.8
People 91.7
Person 89.6
Person 88
Person 83.4
Person 79.6
Face 77.6
Sand 76.4
Person 69.5
Person 67.5
Crowd 59.7
Shorts 56.9
Person 49.3

Clarifai
created on 2019-06-01

winter 96.8
snow 96.5
desktop 94.1
cold 91.3
frost 89.5
people 89.4
ice 89.2
vintage 88.3
frozen 85.7
frosty weather 83.5
season 83
abstract 83
texture 82.8
design 82.4
retro 81.6
old 81.3
monochrome 79.1
woman 78.9
nature 78.7
Christmas 78.1

Imagga
created on 2019-06-01

negative 100
film 86.7
photographic paper 66.2
photographic equipment 44.1
sketch 32.6
drawing 25.4
grunge 23.8
texture 21.5
water 20
representation 19.6
cold 18.9
old 18.1
ice 17.3
snow 17.2
cool 16.9
vintage 16.5
aged 16.3
decoration 16.1
pattern 15.7
frozen 15.3
antique 14.7
glass 13.9
art 13.7
clear 13.1
color 12.8
drop 12.7
structure 12.6
frost 12.5
detail 12.1
natural 12
design 11.8
splash 11.6
wallpaper 11.5
textured 11.4
paint 10.9
space 10.9
frame 10.8
weather 10.6
grungy 10.4
paper 10.3
winter 10.2
floral 10.2
flower 10
dirty 9.9
material 9.8
backgrounds 9.7
outdoors 9.7
forest 9.6
motion 9.4
nobody 9.3
horizontal 9.2
close 9.1
effect 9.1
black 9
retro 9
transparent 9
surface 8.8
text 8.7
liquid 8.7
splashing 8.7
aging 8.6
canvas 8.5
decorative 8.3
border 8.1
fresh 7.8
icy 7.8
people 7.8
season 7.8
wave 7.8
crystal 7.6
rain 7.5
clean 7.5
flowing 7.5
style 7.4
environment 7.4
purity 7.4
speed 7.3
smooth 7.3
celebration 7.2
bright 7.1
river 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

old 87.2
posing 82.3
wedding dress 75.1
clothing 70.3
person 67.7
black and white 62.2
vintage 28.3
painting 16

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 53.6%
Confused 45.3%
Disgusted 45.5%
Calm 51.1%
Angry 45.3%
Sad 45.6%
Happy 46.7%
Surprised 45.4%

AWS Rekognition

Age 23-38
Gender Female, 53.2%
Confused 45.4%
Surprised 45.4%
Happy 45.3%
Calm 46.3%
Sad 50.7%
Disgusted 46.2%
Angry 45.7%

AWS Rekognition

Age 20-38
Gender Female, 52.7%
Happy 45.8%
Disgusted 45.2%
Angry 45.3%
Calm 51.7%
Sad 45.8%
Confused 45.7%
Surprised 45.6%

AWS Rekognition

Age 35-52
Gender Female, 54.7%
Sad 47.6%
Angry 45.3%
Happy 46.3%
Confused 45.1%
Calm 50.3%
Surprised 45.2%
Disgusted 45.2%

Feature analysis

Amazon

Person 98.2%

Categories

Imagga

paintings art 100%