Human Generated Data

Title

Untitled (portrait of five young adults seated on motor scooters)

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2858

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Clothing 99.5
Apparel 99.5
Person 99.2
Human 99.2
Person 99
Person 98.5
Play 98.1
Shorts 96.9
Female 96.5
Person 95.9
Plant 94
Grass 94
Person 93.7
Dress 93
Outdoors 92.9
Tree 92.2
Lawn 89.5
Park 89.5
Yard 87.6
Nature 87.6
Military 83.6
Child 83.5
Kid 83.5
Woman 82
Face 79.8
Girl 79.7
Military Uniform 79.1
Chair 75.3
Furniture 75.3
Machine 74.9
Wheel 74.9
People 74.7
Shoe 74.3
Footwear 74.3
Teen 73.8
Blonde 71.5
Soldier 68.5
Photography 68.3
Portrait 68.3
Photo 68.3
Vegetation 65.5
Shoe 63.7
Leisure Activities 62.9
Army 62.3
Armored 62.3
Food 61.1
Meal 61.1
Skirt 60
Man 59.7
Officer 59.2
Coat 58.4
Boy 57.5
Pants 56.7
Overcoat 56.6
Suit 56.6
Vacation 56.2
Swimwear 55
Wheel 54
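The Amazon list above pairs each predicted label with a confidence score, and repeats a label once per detected instance (e.g. several "Person" entries, two "Wheel" entries). As a minimal sketch of how such output might be cleaned for display (not the museum's actual pipeline; the sample data and the 90.0 cutoff are arbitrary choices for illustration):

```python
# Illustrative only: filter and deduplicate (label, confidence) pairs
# in the style of the Amazon Rekognition output above.
labels = [
    ("Clothing", 99.5), ("Person", 99.2), ("Person", 99.0),
    ("Play", 98.1), ("Wheel", 74.9), ("Wheel", 54.0),
]

def top_labels(pairs, min_confidence=90.0):
    """Keep each label once, at its highest confidence, above a cutoff."""
    best = {}
    for name, conf in pairs:
        if conf >= min_confidence and conf > best.get(name, 0.0):
            best[name] = conf
    # Sort by confidence, highest first.
    return sorted(best.items(), key=lambda kv: -kv[1])

print(top_labels(labels))  # [('Clothing', 99.5), ('Person', 99.2), ('Play', 98.1)]
```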

Imagga
created on 2022-01-16

sport 26
person 25.7
pedestrian 24.4
adult 21.6
outdoors 21.3
people 19
athlete 18.4
outdoor 18.3
crutch 18.2
man 18.1
attractive 16.8
happy 14.4
staff 14.1
dancer 13.8
summer 13.5
ball 13.4
portrait 12.9
lifestyle 12.3
fashion 12.1
body 12
fun 12
pretty 11.9
dress 11.7
leisure 11.6
stick 11.3
male 11.3
sexy 11.2
street 11
performer 11
exercise 10.9
women 10.3
youth 10.2
competition 10.1
active 10.1
park 10
recreation 9.9
style 9.6
couple 9.6
boy 9.6
day 9.4
teenager 9.1
playing 9.1
fitness 9
black 9
vacation 9
one 9
child 8.8
together 8.8
run 8.7
smiling 8.7
life 8.6
happiness 8.6
sitting 8.6
outside 8.6
smile 8.5
runner 8.3
lady 8.1
activity 8.1
grass 7.9
love 7.9
model 7.8
play 7.8
musical instrument 7.7
two 7.6
entertainer 7.6
joy 7.5
action 7.4
sunset 7.2
game 7.1

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

outdoor 99.6
text 96
footwear 95.4
person 80.9
clothing 77.9
black and white 60.4
posing 48.1

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.7%
Calm 76%
Happy 7.8%
Sad 6.4%
Confused 4.4%
Disgusted 2.2%
Fear 1.3%
Angry 0.9%
Surprised 0.8%

AWS Rekognition

Age 20-28
Gender Female, 95.8%
Calm 89.7%
Fear 3.1%
Sad 1.8%
Disgusted 1.8%
Surprised 1.4%
Happy 0.9%
Angry 0.8%
Confused 0.5%

AWS Rekognition

Age 24-34
Gender Male, 98.3%
Calm 90.1%
Surprised 4.7%
Sad 2.1%
Happy 1%
Fear 0.7%
Disgusted 0.5%
Confused 0.5%
Angry 0.4%
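Each AWS Rekognition face block above reports a full distribution over eight emotion labels. A small sketch of how the dominant emotion might be read off such a listing (values copied from the first face above; this is an illustration, not the site's actual code):

```python
# Illustrative sketch: pick the dominant emotion from a Rekognition-style
# emotion/confidence listing (values from the first face analysis above).
emotions = {
    "Calm": 76.0, "Happy": 7.8, "Sad": 6.4, "Confused": 4.4,
    "Disgusted": 2.2, "Fear": 1.3, "Angry": 0.9, "Surprised": 0.8,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # Calm
```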

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Wheel 74.9%
Shoe 74.3%

Captions

Microsoft

a group of people posing for a photo 89.8%
a group of people posing for a picture 89.7%
a group of people posing for the camera 89.6%

Text analysis

Google

YT3RA2-XAGON