Human Generated Data

Title

Untitled (boy dressed in white, seated, holding bible and rosary on lap)

Date

c. 1950

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13346

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 98.8
Person 98.7
Shoe 94.5
Footwear 94.5
Clothing 94.5
Apparel 94.5
Chef 65.3
Sitting 64.2
Furniture 57.1
Flooring 57.0
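The Amazon scores above are raw per-label confidences, and downstream displays typically keep only labels above a cutoff. A minimal sketch in Python (an illustration, not the museum's actual pipeline) of thresholding the listed values at 90%:

```python
# Label/confidence pairs transcribed from the Amazon section above.
AMAZON_LABELS = [
    ("Human", 98.8), ("Person", 98.7), ("Shoe", 94.5),
    ("Footwear", 94.5), ("Clothing", 94.5), ("Apparel", 94.5),
    ("Chef", 65.3), ("Sitting", 64.2), ("Furniture", 57.1),
    ("Flooring", 57.0),
]

def high_confidence(labels, threshold=90.0):
    """Return label names whose confidence meets the threshold."""
    return [name for name, score in labels if score >= threshold]

print(high_confidence(AMAZON_LABELS))
# ['Human', 'Person', 'Shoe', 'Footwear', 'Clothing', 'Apparel']
```

At this cutoff the low-scoring guesses (Chef, Sitting, Furniture, Flooring) drop out, which matches the pattern in the Feature analysis section below, where only Person and Shoe are retained.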

Imagga
created on 2022-01-23

man 41
person 35.4
male 33.4
musical instrument 32.1
people 27.3
adult 23.6
concertina 23.3
wind instrument 21.8
device 19.9
black 19.9
free-reed instrument 19.3
professional 17.9
portrait 17.5
men 17.2
happy 16.9
businessman 15.9
business 15.2
worker 15.1
guitar 14.4
job 14.2
home 13.6
washboard 13.5
handsome 13.4
holding 13.2
indoors 13.2
stringed instrument 13.1
smile 12.8
fashion 12.8
grandfather 12.5
smiling 12.3
bow tie 12.3
waiter 12.2
senior 12.2
music 11.8
necktie 11.7
studio 11.4
office 11.2
instrument 11
lifestyle 10.8
performer 10.6
guy 10.3
sitting 10.3
accordion 10.1
book 10.1
coat 10
face 9.9
modern 9.8
dining-room attendant 9.8
musician 9.8
medical 9.7
sexy 9.6
hair 9.5
work 9.5
shirt 9.3
suit 9.2
occupation 9.2
friendly 9.1
attractive 9.1
chair 9.1
employee 9
human 9
computer 8.9
looking 8.8
boy 8.7
profession 8.6
corporate 8.6
youth 8.5
pretty 8.4
mature 8.4
laptop 8.3
retro 8.2
keyboard instrument 8.1
success 8
working 8
model 7.8
play 7.8
elegant 7.7
patient 7.7
expression 7.7
serious 7.6
thinking 7.6
tie 7.6
style 7.4
garment 7.4
entertainment 7.4
lady 7.3
group 7.3
happiness 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

wall 96
indoor 93.8
floor 93.3
person 91.3
text 82.5
clothing 71.9
furniture 64.9

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 4-12
Gender Male, 100%
Calm 98.1%
Sad 1.2%
Confused 0.3%
Angry 0.2%
Happy 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

Microsoft Cognitive Services

Age 6
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Shoe 94.5%

Captions

Microsoft

a man sitting on a table 82.8%
a man sitting at a table 82.7%
a man sitting on top of a table 79.6%

Text analysis

Amazon

cl