Human Generated Data

Title

Untitled (group of men clowning around with cigars and decorative ribbons on lapel)

Date

c. 1907

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3859

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.4
Human 99.4
Person 98.5
Person 95.3
Person 94.5
Person 93.9
Sport 91
Sports 91
Croquet 88.2
Person 63.7
Person 61.7

Clarifai
created on 2019-06-01

people 100
group together 99.5
many 99.2
adult 98
several 96.7
man 94.1
group 93.5
wear 92.6
uniform 89
athlete 86.7
sports equipment 86.1
education 84.5
child 82.3
four 80.4
outfit 78.9
leader 78.5
woman 75.3
three 75.2
administration 73.8
five 71.3

Imagga
created on 2019-06-01

ballplayer 100
athlete 91.4
player 81.2
contestant 66.7
person 45.8
man 28.9
silhouette 25.7
beach 25.4
people 24.6
sunset 24.3
male 22.8
adult 18.2
sport 15.9
sea 15.6
exercise 15.4
boy 14.8
sky 14.7
run 14.5
outdoors 14.2
water 14
runner 14
men 13.7
lifestyle 13.7
sand 13.1
outdoor 13
fitness 12.7
ocean 12.5
walking 12.3
summer 12.2
couple 12.2
sports equipment 11.7
dark 11.7
active 11.7
black 11.4
sun 11.3
leisure 10.8
vacation 10.6
action 10.2
coast 9.9
human 9.7
health 9.7
together 9.6
body 9.6
walk 9.5
outside 9.4
evening 9.3
pose 9.1
fun 9
group 8.9
businessman 8.8
cricket equipment 8.8
running 8.6
dusk 8.6
business 8.5
portrait 8.4
teenager 8.2
world 8.2
dress 8.1
love 7.9
motion 7.7
mask 7.7
old 7.7
hand 7.6
equipment 7.5
shore 7.4
home plate 7.4
life 7.3
freedom 7.3
alone 7.3
recreation 7.2
team 7.2
romantic 7.1
women 7.1
happiness 7.1
travel 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

old 94.2
posing 85.4
black 79.3
white 75.4
person 60.8
vintage 55.8
clothing 53.9

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Male, 51.1%
Angry 45.7%
Happy 46.1%
Confused 46.6%
Calm 48.3%
Disgusted 45.5%
Sad 46.3%
Surprised 46.5%

AWS Rekognition

Age 10-15
Gender Female, 52.5%
Happy 45.7%
Confused 46%
Angry 45.8%
Disgusted 45.6%
Surprised 45.9%
Sad 46.9%
Calm 49%

AWS Rekognition

Age 20-38
Gender Male, 51.1%
Disgusted 45.3%
Happy 45.2%
Sad 46.4%
Calm 52%
Angry 45.3%
Surprised 45.3%
Confused 45.5%

AWS Rekognition

Age 30-47
Gender Female, 54.2%
Happy 47.9%
Surprised 45.9%
Angry 45.7%
Confused 45.5%
Calm 48.5%
Sad 45.8%
Disgusted 45.7%

AWS Rekognition

Age 26-43
Gender Female, 52.9%
Confused 45.8%
Surprised 45.2%
Happy 45%
Angry 45.6%
Sad 52.4%
Disgusted 45.2%
Calm 45.8%

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 95.8%
a vintage photo of a group of people posing for a picture 95.7%
a vintage photo of a group of people posing for a photo 95.3%