Human Generated Data

Title

Untitled (three young men and two young women leaning against railing)

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2859

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-16

Person 99.8
Human 99.8
Person 99.6
Person 99.4
Person 99.1
Person 99
Suit 84.7
Coat 84.7
Clothing 84.7
Overcoat 84.7
Apparel 84.7
Outdoors 78.4
Nature 73.7
Field 73.6
Shorts 73.2
Face 68.9
Tennis Racket 68.4
Racket 68.4
Female 67.1
People 64.9
Countryside 64.5
Sport 63
Sports 63
Dress 62.2
Portrait 60.8
Photography 60.8
Photo 60.8
Tuxedo 58.6
Hurdle 56
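
The label/confidence pairs above have the shape of Amazon Rekognition's DetectLabels response. Below is a minimal sketch of how such tags can be generated with boto3; the file name and region are placeholders rather than details taken from this record.

    import boto3

    # Placeholder file and region; DetectLabels accepts raw image bytes up to 5 MB.
    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    for label in response["Labels"]:
        # Confidence is a float in [0, 100], matching the scores listed above.
        print(f'{label["Name"]} {label["Confidence"]:.1f}')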

Clarifai
created on 2023-10-26

people 99.8
group together 99.1
group 98.4
adult 97.8
man 96.6
many 96
woman 95
bench 94.9
competition 93.6
wear 92.8
five 92.3
child 91.8
recreation 91.3
three 91
monochrome 90.8
athlete 88.1
several 86.7
boxer 85.9
boy 84.8
sibling 82.3
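
Clarifai's concepts can be retrieved in much the same way. The sketch below targets Clarifai's v2 REST API; the model path, personal access token, and image URL are all assumptions for illustration, not details taken from this record.

    import requests

    # Assumed path for Clarifai's public general model; the token and
    # image URL are placeholders.
    resp = requests.post(
        "https://api.clarifai.com/v2/users/clarifai/apps/main/models/"
        "general-image-recognition/outputs",
        headers={"Authorization": "Key <personal-access-token>"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    )

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Concept values are in [0, 1]; scale to match the percentages above.
        print(concept["name"], round(concept["value"] * 100, 1))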

Imagga
created on 2022-01-16

percussion instrument 86.4
musical instrument 85.5
marimba 67.5
vibraphone 29
man 24.9
people 21.2
silhouette 19
male 18.4
person 18.3
device 14.5
black 13.9
wind instrument 13.7
accordion 13.3
sunset 12.6
adult 12.3
keyboard instrument 12.3
sky 12.1
sitting 12
park 11.5
businessman 11.5
water 11.3
outdoors 11.2
sport 10.7
boy 10.4
portrait 10.3
business 10.3
lifestyle 10.1
stage 10
holding 9.9
together 9.6
couple 9.6
symbol 9.4
player 8.7
love 8.7
steel drum 8.6
men 8.6
friends 8.5
outdoor 8.4
old 8.4
leisure 8.3
competition 8.2
recreation 8.1
women 7.9
design 7.9
relaxation 7.5
field 7.5
event 7.4
training 7.4
laptop 7.3
platform 7.3
athlete 7.2
sun 7.2
bench 7.1
night 7.1
day 7.1
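
Imagga's tags come from its v2 tagging endpoint, which already reports confidence on a 0-100 scale. A short sketch, assuming placeholder API credentials and an illustrative image URL:

    import requests

    # API key/secret and image URL are placeholders.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("<api_key>", "<api_secret>"),
    )

    for tag in resp.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))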

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

text 98.4
outdoor 93.6
person 89.4
posing 88.3
clothing 83.8
black and white 81.3
standing 77.3
man 73.2
net 25.1
male 15.1
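
Microsoft's tags correspond to the Tags feature of the Azure Computer Vision Analyze API. A sketch against the v3.2 REST endpoint, with the resource endpoint, key, and image URL as placeholders:

    import requests

    endpoint = "https://<resource>.cognitiveservices.azure.com"  # placeholder
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "<key>"},
        json={"url": "https://example.org/photo.jpg"},
    )

    for tag in resp.json()["tags"]:
        # Azure reports confidence in [0, 1]; scale to match the list above.
        print(tag["name"], round(tag["confidence"] * 100, 1))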

Face analysis

AWS Rekognition

Age 33-41
Gender Female, 99.7%
Sad 32.3%
Fear 28.6%
Calm 19.4%
Surprised 6.5%
Happy 5.5%
Angry 2.6%
Confused 2.6%
Disgusted 2.5%

AWS Rekognition

Age 18-26
Gender Female, 69.9%
Calm 73.8%
Happy 12%
Sad 4.1%
Surprised 4.1%
Fear 2.1%
Angry 1.7%
Confused 1.2%
Disgusted 1%

AWS Rekognition

Age 19-27
Gender Female, 54%
Happy 46.2%
Fear 16.8%
Surprised 15.8%
Calm 12.3%
Sad 5.9%
Disgusted 1.2%
Confused 1%
Angry 0.7%

AWS Rekognition

Age 23-31
Gender Female, 98.5%
Calm 92.1%
Sad 3.5%
Happy 1.9%
Angry 0.7%
Confused 0.7%
Surprised 0.5%
Disgusted 0.4%
Fear 0.4%
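
Each block above is one face from Amazon Rekognition's DetectFaces response, which returns an estimated age range, a gender guess, and a confidence for each of eight emotions. A minimal boto3 sketch, with a placeholder file name:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        # Sort emotions by confidence, as in the listings above.
        emotions = sorted(face["Emotions"], key=lambda e: -e["Confidence"])
        print(f'Age {age["Low"]}-{age["High"]}, '
              f'{gender["Value"]} {gender["Confidence"]:.1f}%, '
              f'top emotion {emotions[0]["Type"]} {emotions[0]["Confidence"]:.1f}%')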

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
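
The five blocks above are per-face results from Google Cloud Vision face detection, which reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch using the google-cloud-vision client, assuming application default credentials and a placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # uses application default credentials
    with open("photo.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    for face in client.face_detection(image=image).face_annotations:
        # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY.
        print("Joy:", face.joy_likelihood.name,
              "Surprise:", face.surprise_likelihood.name,
              "Headwear:", face.headwear_likelihood.name,
              "Blurred:", face.blurred_likelihood.name)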

Feature analysis

Amazon

Person 99.8%

Text analysis

Amazon

KODAK-SEELA
BU
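
The fragments above match the shape of Amazon Rekognition's DetectText output, which returns each detected line and word in the image along with a confidence. A minimal boto3 sketch, again with a placeholder file name:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for det in response["TextDetections"]:
        if det["Type"] == "LINE":  # skip the per-word detections
            print(det["DetectedText"])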