Human Generated Data

Title

Untitled (woman in cabin at set table, ladder back chairs)

Date

1956

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18651

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Apparel 97.7
Clothing 97.7
Footwear 97.1
Shoe 97.1
Chair 77.5
Furniture 77.5
Sleeve 75.5
Female 64.6
Shorts 61
Art 58
Flooring 56.2
Crypt 55.9
Long Sleeve 55.8
Chair 55.2

Imagga
created on 2022-03-05

weight 77.2
dumbbell 53.2
sports equipment 49.4
equipment 40.7
barbell 33.3
man 28.9
adult 25.4
male 22.7
training 22.2
gym 22
person 22
fitness 21.7
people 21.2
exercise 20
men 18
sexy 17.7
sport 17.3
model 15.6
body 15.2
black 15
strength 15
fashion 14.3
portrait 13.6
salon 13.5
muscular 13.4
attractive 13.3
musical instrument 13
active 12.6
workout 12.4
athlete 12.3
club 12.2
urban 12.2
face 12.1
fit 12
health 11.8
power 11.7
weights 11.7
city 11.6
lifestyle 11.6
indoors 11.4
device 11.4
healthy 11.3
human 11.2
pretty 11.2
women 11.1
muscles 10.8
building 10.4
brass 10.4
business 10.3
lifting 9.8
posing 9.8
wind instrument 9.6
style 9.6
happy 9.4
suit 9.4
handsome 8.9
professional 8.7
sitting 8.6
arm 8.5
legs 8.5
casual 8.5
two 8.5
strong 8.4
modern 8.4
elegance 8.4
occupation 8.2
indoor 8.2
one 8.2
office 8
job 8
clothing 7.9
bodybuilding 7.8
smile 7.8
standing 7.8
corporate 7.7
muscle 7.7
train 7.7
hairdresser 7.6
leisure 7.5
machine 7.4
holding 7.4
action 7.4
horn 7.3
smiling 7.2
dress 7.2
interior 7.1
businessman 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.5
drawing 96
cartoon 87.5
black and white 84.9
furniture 79.7
sketch 78.6
clothing 77.8
person 65.3
table 59.2

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99%
Calm 89.1%
Surprised 4.7%
Happy 3.1%
Fear 1.3%
Sad 0.6%
Disgusted 0.5%
Confused 0.4%
Angry 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 97.1%
Chair 77.5%

Captions

Microsoft

a person sitting on a bench 37.3%
a person sitting on a bench 37.2%
a person is sitting on a bench 28.7%

Text analysis

Amazon

M2J17--YT37A°S--XOX

Google

-
YT3RA°2
MJI7--
MJI7-- YT3RA°2 - - XAGO
XAGO