Human Generated Data

Title

Untitled (two ladies in library)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17269

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.8
Human 98.8
Person 98.1
Room 94.9
Indoors 94.9
Furniture 90.2
Library 89
Book 89
Apparel 85.3
Clothing 85.3
Bookcase 83
Shelf 80.6
Female 64.9
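
Labels of this kind, each paired with a confidence score, are what Amazon Rekognition's DetectLabels operation returns. A minimal sketch, assuming AWS credentials are already configured for boto3 and the photograph is available locally; the file name is a placeholder:

```python
import boto3

# Sketch: fetch object/scene labels with confidence scores, as in the list above.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder file name.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap on the number of labels returned
    MinConfidence=60.0,  # drop low-confidence labels
)

for label in response["Labels"]:
    # e.g. "Person 98.8", "Library 89.0"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```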

Imagga
created on 2022-02-26

turnstile 66.6
gate 65
prison 52.3
cell 49
movable barrier 40.1
correctional institution 39.7
penal institution 29.8
barrier 26.9
people 26.2
business 26.1
building 21.8
man 21.5
male 20.6
institution 19.8
person 17.8
window 17.6
urban 17.5
city 17.5
adult 16.2
men 15.5
modern 15.4
architecture 14.2
interior 14.1
office 13.7
obstruction 13.6
entrance 13.5
women 13.4
corporate 12.9
crowd 12.5
silhouette 12.4
inside 12
portrait 11.6
businessman 11.5
walking 11.4
fashion 11.3
standing 11.3
travel 11.3
transportation 10.8
airport 10.7
indoors 10.5
work 10.2
establishment 10.2
indoor 10
happy 10
to 9.7
walk 9.5
casual 9.3
attractive 9.1
old 9.1
group 8.9
corridor 8.8
looking 8.8
full length 8.7
journey 8.5
professional 8.4
suit 8.2
businesswoman 8.2
black 8.2
light 8
hall 8
steel 8
passenger 7.9
departure 7.9
luggage 7.9
boy 7.8
scene 7.8
glass 7.8
station 7.7
wall 7.7
industry 7.7
boss 7.6
human 7.5
outdoors 7.5
security 7.4
metal 7.2
success 7.2
lifestyle 7.2
worker 7.1
happiness 7
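
Tag/confidence pairs like these come from Imagga's auto-tagging endpoint. A minimal sketch, assuming an Imagga API key and secret and a publicly reachable image URL (all three are placeholders), and assuming the response shape documented for the /v2/tags endpoint:

```python
import requests

# Sketch: request auto-tags for an image via Imagga's /v2/tags endpoint.
# API_KEY, API_SECRET, and IMAGE_URL are placeholders.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key/secret
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # e.g. "turnstile 66.6", "gate 65.0"
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```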

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99
person 92.6
clothing 89.4
black and white 89.1
book 85.4
dress 80.6
woman 78.9

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
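
The per-face ratings above (surprise, anger, sorrow, joy, headwear, blur) correspond to the likelihood fields returned by Google Cloud Vision face detection, one block per detected face. A minimal sketch, assuming the google-cloud-vision client library and application credentials are set up; the file name is a placeholder:

```python
from google.cloud import vision

# Sketch: run face detection and print the likelihood fields shown above.
# Assumes Google Cloud credentials are configured; "photo.jpg" is a placeholder.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```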

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a man and a woman standing in a library 59.4%
a person standing in a library 59.3%
a person standing in front of a store 59.2%
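
Captions with confidence scores of this form, along with the Microsoft tag list above, are what the Azure Computer Vision Analyze Image call returns when Description and Tags are requested. A minimal sketch against the v3.2 REST API, assuming an Azure endpoint URL and subscription key (both placeholders):

```python
import requests

# Sketch: request a description (captions) and tags from Azure Computer Vision v3.2.
# ENDPOINT and KEY are placeholders for a real Azure resource; "photo.jpg" likewise.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "your_subscription_key"

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Description,Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()
result = response.json()

for caption in result["description"]["captions"]:
    # e.g. "a man and a woman standing in a library 59.4%"
    print(f'{caption["text"]} {caption["confidence"]:.1%}')

for tag in result["tags"]:
    # e.g. "person 92.6%", "book 85.4%"
    print(f'{tag["name"]} {tag["confidence"]:.1%}')
```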

Text analysis

Amazon

VIL
1
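
Fragments like "VIL" and "1" are the sort of output Amazon Rekognition's DetectText operation returns for lettering it finds in an image. A minimal sketch, assuming the same boto3 setup and placeholder file name as above:

```python
import boto3

# Sketch: detect text in the image; Rekognition returns LINE and WORD detections
# with confidence scores. "photo.jpg" is a placeholder file name.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # e.g. "VIL", "1"
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))
```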