Human Generated Data

Title

Untitled (children at table with globe)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16725

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.5
Human 99.5
Chair 99.2
Furniture 99.2
Apparel 98.8
Clothing 98.8
Person 93.9
Person 93
Table 81
Dining Table 81
Helmet 73.2
People 68.4
Coat 64.5
Overcoat 64.5
Suit 64.5
Face 63
Urban 62.3
Hat 62
Bonnet 56.4
Photography 55.3
Photo 55.3

Imagga
created on 2022-02-26

people 21.2
man 19.5
male 17
television 16.9
sitting 14.6
lifestyle 13.7
telecommunication system 13.6
room 13.5
person 13.4
water 13.3
blackboard 13.1
old 12.5
couple 12.2
senior 12.2
outdoors 11.9
leisure 11.6
bathroom 11.1
happy 10.6
adult 10.4
smiling 10.1
aged 9.9
vintage 9.9
tub 9.8
portrait 9.7
men 9.4
newspaper 8.8
sit 8.5
black 8.4
modern 8.4
relaxation 8.4
mature 8.4
fun 8.2
retro 8.2
relaxing 8.2
active 8.1
family 8
negative 7.9
together 7.9
window 7.8
ancient 7.8
art 7.7
shop 7.7
health 7.6
bath 7.6
one 7.5
indoor 7.3
wet 7.1
hair 7.1
women 7.1
love 7.1
happiness 7
product 7
sky 7

Microsoft
created on 2022-02-26

text 99.6
window 85.2
tableware 73.7
table 61.2
sink 56.8
old 48.9

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 99.9%
Calm 33.9%
Surprised 30.6%
Happy 18.7%
Sad 7%
Angry 3.3%
Fear 2.3%
Confused 2.2%
Disgusted 2%

AWS Rekognition

Age 20-28
Gender Male, 98.4%
Calm 47.9%
Happy 25.4%
Sad 11.4%
Surprised 6.3%
Fear 3.5%
Angry 2.8%
Disgusted 1.4%
Confused 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 99.2%
Helmet 73.2%

Captions

Microsoft

a group of people standing in front of a window 68.3%
a group of people in front of a window 68.2%
a group of people sitting at a table in front of a window 56.6%

Text analysis

Amazon

7
KODAK--ITW

Google

MJI7--
--
MJI7-- YT37A°2 -- XAGO
YT37A°2
XAGO