Human Generated Data

Title

Untitled (group of women marching down residential street with car in parade)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8945

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Car 99.6
Automobile 99.6
Vehicle 99.6
Transportation 99.6
Person 99.2
Human 99.2
Person 99.2
Person 99.1
Person 99
Person 97.9
Person 97.6
Person 97.5
Person 96.3
Machine 95.4
Wheel 95.4
Person 95.1
Person 90.6
Car 89.5
Pedestrian 89.1
Asphalt 86.4
Tarmac 86.4
Person 83.7
Apparel 80.7
Clothing 80.7
Person 80.5
Person 79.7
Spoke 72.7
Shorts 72.4
Person 70.4
People 70
Road 65.6
Person 62.1
Outdoors 61.8
Alloy Wheel 61.2
Tire 60.7
Car Wheel 60.3
Crowd 60.1
Caravan 58.6
Van 58.6
Coupe 58.4
Sports Car 58.4
Path 56.3
Person 46.6

Imagga
created on 2022-01-09

people 24
person 19.1
athlete 18.7
group 17.7
male 17.7
sport 17.1
runner 16.9
travel 15.5
men 15.4
man 15.4
outdoors 15.3
silhouette 13.2
business 11.5
sky 11.5
contestant 11.4
adult 11.4
active 11.3
fun 11.2
summer 10.9
crowd 10.6
day 10.2
beach 10.1
water 10
city 10
leisure 10
sunset 9.9
run 9.6
women 9.5
happiness 9.4
lifestyle 9.4
action 9.3
spectator 9.1
transportation 9
grass 8.7
boy 8.7
walking 8.5
friends 8.4
field 8.4
back 8.3
transport 8.2
brass 8.2
leg 8.1
happy 8.1
world 8.1
activity 8.1
smiling 8
motion 7.7
dancer 7.7
performer 7.7
old 7.7
outdoor 7.6
human 7.5
friendship 7.5
ocean 7.5
building 7.5
tourism 7.4
sports equipment 7.4
vacation 7.4
life 7.3
exercise 7.3
office 7.2
black 7.2
clothing 7.1
vehicle 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98
person 97.2
road 97.2
clothing 96.8
black and white 94.2
man 80
dance 80
street 70.7
footwear 68.2
woman 62.1
people 57.7

Face analysis

Amazon

Google

AWS Rekognition

Age 20-28
Gender Male, 77.7%
Fear 69.9%
Calm 15.3%
Disgusted 5%
Sad 3.3%
Surprised 2.3%
Confused 1.7%
Happy 1.4%
Angry 1.1%

AWS Rekognition

Age 24-34
Gender Male, 99.7%
Calm 61.8%
Angry 14.9%
Confused 9.1%
Sad 5.3%
Happy 3.8%
Disgusted 2.4%
Surprised 1.6%
Fear 1.1%

AWS Rekognition

Age 20-28
Gender Male, 99.6%
Calm 99.2%
Confused 0.3%
Fear 0.2%
Surprised 0.1%
Sad 0.1%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 25-35
Gender Male, 91.5%
Calm 75.2%
Surprised 14.8%
Happy 4.4%
Fear 3.6%
Disgusted 0.7%
Confused 0.6%
Sad 0.5%
Angry 0.3%

AWS Rekognition

Age 39-47
Gender Male, 61.2%
Calm 46%
Fear 29.5%
Happy 12.3%
Angry 3.7%
Disgusted 2.8%
Sad 2.8%
Confused 1.5%
Surprised 1.4%

AWS Rekognition

Age 19-27
Gender Female, 62.1%
Happy 41.1%
Sad 16.2%
Fear 15.2%
Angry 14.2%
Calm 5.8%
Disgusted 2.9%
Confused 2.8%
Surprised 1.9%

AWS Rekognition

Age 18-24
Gender Female, 98.1%
Calm 28.9%
Happy 28.5%
Fear 22.5%
Surprised 11%
Angry 4.3%
Disgusted 1.8%
Sad 1.6%
Confused 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Car 99.6%
Person 99.2%
Wheel 95.4%

Captions

Microsoft

a group of people standing in front of a building 79.3%
a group of people in front of a building 79.2%
a group of people standing in front of a store 65.5%