Human Generated Data

Title

Untitled (girl standing on wall fire hydrant)

Date

c. 1950, printed later

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.208

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Apparel 100
Clothing 100
Coat 100
Overcoat 98.2
Person 98.1
Human 98.1
Person 97.5
Person 95.9
Person 91.6
Suit 86.6
Sleeve 69.5
Shoe 65.8
Footwear 65.8
Shoe 63.2
Skateboard 62.5
Sport 62.5
Sports 62.5
Tarmac 56.3
Asphalt 56.3
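
The machine-generated entries above follow a simple "Label confidence" format. As an illustrative sketch (not part of the original record), lines like these can be parsed into structured pairs; the `parse_tags` helper name and the sample lines chosen are assumptions for demonstration only, with the sample values taken verbatim from the Amazon list above.

```python
def parse_tags(lines):
    """Split lines of the form 'Label 98.2' into (label, confidence) pairs.

    Labels may contain spaces (e.g. 'white goods 8.5'), so only the final
    whitespace-separated token is treated as the confidence score.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank separator lines
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

# Sample values copied from the Amazon tag list above.
amazon_tags = parse_tags([
    "Apparel 100",
    "Overcoat 98.2",
    "Skateboard 62.5",
])
print(amazon_tags)
```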

Imagga
created on 2021-12-14

business 36.4
people 31.8
man 26.2
briefcase 24.9
corporate 24.1
businessman 22.1
male 22.1
adult 21.4
city 20.8
urban 20.1
men 19.8
group 19.3
suit 18.4
black 17.6
person 16.6
silhouette 16.6
office 16
women 15.8
professional 15.4
attractive 15.4
building 14.5
success 14.5
happy 13.8
fashion 13.6
work 13.3
window 12.9
team 12.5
walk 12.4
walking 12.3
meeting 12.3
standing 12.2
architecture 12.1
world 11.6
crowd 11.5
executive 11.2
businesswoman 10.9
job 10.6
refrigerator 10.5
portrait 10.4
manager 10.2
gate 10.2
blur 10.2
company 10.2
weapon 10.1
reflection 9.8
human 9.7
mall 9.7
shop 9.5
career 9.5
motion 9.4
device 9.4
teamwork 9.3
street 9.2
travel 9.2
clothing 9.2
bag 9.1
modern 9.1
worker 9
working 8.8
airport 8.8
boss 8.6
wall 8.6
smile 8.5
white goods 8.5
pretty 8.4
shopping 8.4
looking 8
lifestyle 7.9
happiness 7.8
hands 7.8
silhouettes 7.8
sales 7.7
life 7.6
legs 7.5
successful 7.3
interior 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

clothing 96.4
black and white 94.9
person 89.5
text 89
coat 80.1
footwear 68.6
street 64.5
monochrome 62.8
white 61.8
jacket 59.5

Face analysis

AWS Rekognition

Age 21-33
Gender Female, 83.3%
Disgusted 69.2%
Sad 14.2%
Calm 6.7%
Fear 3.3%
Angry 3%
Confused 2.7%
Surprised 0.5%
Happy 0.4%

AWS Rekognition

Age 24-38
Gender Female, 90%
Sad 46.9%
Fear 26.7%
Calm 10.9%
Happy 5.9%
Angry 5.4%
Disgusted 1.7%
Surprised 1.4%
Confused 1.1%

AWS Rekognition

Age 11-21
Gender Female, 89.8%
Sad 61%
Fear 20.4%
Angry 8.4%
Calm 5%
Confused 3.1%
Surprised 1.6%
Happy 0.2%
Disgusted 0.2%

AWS Rekognition

Age 19-31
Gender Female, 65.9%
Sad 50.8%
Calm 27.2%
Fear 12.7%
Happy 2.5%
Confused 2.1%
Surprised 2%
Angry 1.7%
Disgusted 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Coat 100%
Person 98.1%
Shoe 65.8%
Skateboard 62.5%

Captions

Microsoft

a group of people posing for a photo 74.3%
a group of people posing for the camera 74.2%
a person posing for a photo 74.1%