Human Generated Data

Title

Untitled (man and woman sitting with skeleton)

Date

1951

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20270

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Imagga
created on 2022-03-05

male 30.5
person 30
people 29.6
man 28.9
silhouette 27.3
blackboard 23.4
men 22.3
adult 19.4
brass 19.1
classroom 18.8
businessman 18.5
teacher 17.3
business 17
group 16.9
education 16.5
wind instrument 15.5
sport 15
event 13.9
class 13.5
crowd 13.4
hand 12.9
board 12.8
player 12.5
room 12.1
human 12
happy 11.9
women 11.9
school 11.8
musical instrument 11.6
symbol 11.4
black 11.4
student 11.3
drawing 11.2
audience 10.7
dance 10.5
boy 10.4
professional 10.1
device 10
team 9.9
teaching 9.7
employee 9.6
design 9.6
youth 9.4
athlete 9.3
study 9.3
lights 9.3
training 9.2
dark 9.2
competition 9.1
executive 9
world 9
office 8.9
night 8.9
success 8.8
chalkboard 8.8
stage 8.8
lifestyle 8.7
party 8.6
college 8.5
casual 8.5
child 8.5
back 8.4
teenager 8.2
music 8.2
job 8
vibrant 7.9
bright 7.9
couple 7.8
cheering 7.8
happiness 7.8
cornet 7.8
portrait 7.8
grunge 7.7
performance 7.7
boss 7.6
muscular 7.6
gesture 7.6
nation 7.6
sign 7.5
art 7.5
holding 7.4
girls 7.3
horn 7.2
suit 7.2
financial 7.1
university 7
flag 7
modern 7
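The Imagga list above pairs each tag with a confidence score (0–100). As a minimal sketch, assuming only this plain "label score" line format, such tag lines can be parsed into a ranked list in Python (the sample values below are taken from the list above):

```python
# Parse "label confidence" lines like the Imagga tag list above
# into (label, score) pairs sorted by descending confidence.
def parse_tags(lines):
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        # A label may itself contain spaces ("wind instrument 15.5"),
        # so split off only the trailing numeric score.
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return sorted(tags, key=lambda t: t[1], reverse=True)

sample = [
    "male 30.5",
    "wind instrument 15.5",
    "person 30",
]
print(parse_tags(sample))
```

Sorting by score keeps the strongest machine tags first, mirroring the order in which the services report them.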

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.7
musical instrument 93.3
guitar 86.1
person 85.3
black and white 75.1
posing 45.4

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 95.8%
Surprised 36.2%
Calm 32.1%
Happy 16.3%
Angry 4%
Confused 3.9%
Disgusted 3.4%
Sad 2.5%
Fear 1.6%

AWS Rekognition

Age 39-47
Gender Male, 84.9%
Calm 74.2%
Sad 13.6%
Happy 4.9%
Fear 2.7%
Disgusted 2.2%
Surprised 1%
Angry 0.7%
Confused 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.4%
Guitar 52.9%

Captions

Microsoft

text 61.5%

Text analysis

Amazon

&
PRESBREY
CECIL
J3
CECIL & PRESBREY INC
INC
A
ДЛА A ТОЛА
KODAK-EL
ТОЛА
ДЛА

Google

CECIL
YT37A2-
CECIL &PRESBREY YT37A2- AON
&PRESBREY
AON