Human Generated Data

Title

Untitled (group of men clowning around with cigars and decorative ribbons on lapel)

Date

c. 1907

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3857

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Nature 99.6
Person 99.1
Human 99.1
Outdoors 99
Person 98.9
Person 97.3
Person 97
Person 93.9
Person 93.5
Person 89.8
Snow 87
Clothing 84.4
Apparel 84.4
Weather 79.1
Winter 77.2
People 76.1
Person 75.3
Ice 71.7
Sand 61.7
Water 59
Sea 59
Ocean 59
Beach 59
Coast 59
Shoreline 59
Storm 56.5
Shorts 56.1
Person 44.1

Clarifai
created on 2019-06-01

people 99.9
group together 97.3
adult 96.7
group 96.2
many 96
man 95.6
wear 88
several 81.5
woman 79.9
leader 77.3
uniform 75.9
outfit 75.5
child 74.6
military 73.7
administration 71.1
music 71
retro 67.6
war 62.9
athlete 61.2
affection 60.2

Imagga
created on 2019-06-01

negative 76.7
film 60.5
photographic paper 40.8
graffito 31.1
decoration 29.2
grunge 27.3
photographic equipment 27.2
old 22.3
art 20.2
texture 19.5
antique 19
vintage 19
grungy 19
dirty 18.1
structure 17
pattern 16.4
retro 15.6
black 15
rough 14.6
frame 14.2
wall 14
paint 13.6
aged 13.6
space 13.2
design 12.9
border 12.7
material 12.5
cool 11.5
weathered 11.4
text 11.4
color 11.1
decorative 10.9
paper 10.8
dirt 10.5
detail 10.5
ancient 10.4
snow 10
water 10
textured 9.6
edge 9.6
mask 9.6
damaged 9.5
old fashioned 9.5
graphic 9.5
dress 9
memorial 9
cemetery 8.9
scratch 8.8
canvas 8.5
outdoor 8.4
outdoors 8.2
stone 8.2
river 8
building 7.9
noisy 7.9
noise 7.8
rock 7.8
people 7.8
movie 7.8
cold 7.7
collage 7.7
aging 7.7
screen 7.6
landscape 7.4
man 7.4
fountain 7.3
fence 7.2
history 7.2

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

old 96.6
posing 94
person 77.4
black 77.1
clothing 76.1
white 73.2
man 72
player 67.8
black and white 56.6
vintage 53.3
image 34

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 51.9%
Surprised 45.9%
Sad 47.8%
Happy 45.1%
Angry 45.6%
Disgusted 45.2%
Confused 45.6%
Calm 49.9%

AWS Rekognition

Age 19-36
Gender Female, 54.1%
Sad 45.9%
Confused 45.5%
Disgusted 45.9%
Surprised 45.4%
Angry 45.3%
Happy 47.3%
Calm 49.8%

AWS Rekognition

Age 45-63
Gender Female, 52.5%
Angry 45.5%
Surprised 45.5%
Calm 49.6%
Sad 47.7%
Confused 45.4%
Disgusted 45.3%
Happy 46.1%

AWS Rekognition

Age 14-25
Gender Male, 52.6%
Confused 45.5%
Happy 46%
Calm 50.7%
Disgusted 45.4%
Sad 46.8%
Angry 45.4%
Surprised 45.3%

AWS Rekognition

Age 20-38
Gender Female, 54.1%
Happy 45.6%
Surprised 46.4%
Calm 48.3%
Sad 47.9%
Disgusted 45.4%
Angry 45.6%
Confused 45.8%

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 84.5%
a vintage photo of a group of people posing for a picture 84.4%
a vintage photo of a group of people posing for a photo 82.6%