Human Generated Data

Title

Untitled (couple seated in front seat of car, man waving)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8172

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 94.9
Human 94.9
Cushion 91.5
Face 90.8
Clothing 87.3
Apparel 87.3
Automobile 84.7
Car 84.7
Vehicle 84.7
Transportation 84.7
Outdoors 79.7
Person 75.6
Nature 73.4
Portrait 73.4
Photo 73.4
Photography 73.4
Person 67.5
Bridegroom 66.6
Wedding 66.6
Plant 62.3
Vegetation 62.3
Smile 57.1
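
Tag lists like the one above pair a label with a confidence score. As an illustration only (not part of this record), a minimal sketch of how such lines can be parsed and filtered by a confidence threshold, using a subset of the Amazon tags above; the 90.0 cutoff is an arbitrary example value:

```python
# Parse "label confidence" lines (as emitted above) into
# (label, confidence) pairs, then keep only high-confidence tags.
# The raw lines are copied from this record.
raw_tags = """\
Person 94.9
Human 94.9
Cushion 91.5
Face 90.8
Clothing 87.3
Bridegroom 66.6
Smile 57.1"""

def parse_tags(text):
    """Split each line into a label and a float confidence score."""
    pairs = []
    for line in text.splitlines():
        label, score = line.rsplit(" ", 1)
        pairs.append((label, float(score)))
    return pairs

def confident_tags(pairs, threshold):
    """Return labels whose confidence meets the threshold."""
    return [label for label, score in pairs if score >= threshold]

tags = parse_tags(raw_tags)
print(confident_tags(tags, 90.0))  # ['Person', 'Human', 'Cushion', 'Face']
```

Note that multi-word labels (e.g. "black and white" in the Microsoft list) survive the right-split, since only the final token is treated as the score.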

Imagga
created on 2022-01-08

cockpit 100
car 81.5
vehicle 53.1
driver 51.5
automobile 46.9
transportation 44.8
driving 36.7
person 36.4
auto 36.4
drive 35.9
sitting 31.8
seat 29.8
transport 29.2
aviator 28.2
people 27.3
adult 26.5
wheel 25.5
road 24.4
smile 24.2
happy 23.8
man 23.5
portrait 22
happiness 20.4
new 20.2
inside 18.4
attractive 17.5
window 17.4
smiling 17.4
male 16.3
travel 16.2
pretty 16.1
device 15.3
support 15.3
fashion 15.1
cheerful 13.8
hand 13.7
one 13.4
traffic 13.3
business 12.8
motor 12.6
outdoors 11.9
speed 11.9
hair 11.9
casual 11.9
model 11.7
fun 11.2
steering 10.9
looking 10.4
luxury 10.3
20s 10.1
face 9.9
human 9.7
black 9.6
cute 9.3
mirror 9.2
joy 9.2
modern 9.1
interior 8.8
couple 8.7
insurance 8.7
engine 8.7
men 8.6
elegant 8.6
mature 8.4
summer 8.4
suit 8.1
machine 7.7
professional 7.6
horizontal 7.5
holding 7.4
safety 7.4
hat 7.3
color 7.2
sexy 7.2
handsome 7.1
women 7.1
work 7.1
day 7.1

Microsoft
created on 2022-01-08

text 98.2
car 97.6
book 97.4
black and white 91.7
outdoor 86.5
person 76.3
clothing 70.4
man 55.7
van 22.9
clothes 18.3

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 91.8%
Happy 99.4%
Surprised 0.1%
Confused 0.1%
Sad 0.1%
Calm 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Male, 96.3%
Happy 43.4%
Confused 36.1%
Calm 10.2%
Angry 3.6%
Sad 2.9%
Fear 1.9%
Disgusted 1%
Surprised 0.9%
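
The second face record above is a close call between Happy (43.4%) and Confused (36.1%). A hedged sketch of one way a consumer might pick the dominant emotion while flagging such near-ties; the 5-point margin is an assumed example parameter, and the scores are copied from the record above:

```python
# Pick the dominant emotion from an AWS Rekognition-style breakdown.
# Scores copied from the second face record in this document.
emotions = {
    "Happy": 43.4, "Confused": 36.1, "Calm": 10.2, "Angry": 3.6,
    "Sad": 2.9, "Fear": 1.9, "Disgusted": 1.0, "Surprised": 0.9,
}

def dominant_emotion(scores, margin=5.0):
    """Return the top emotion, flagging it as ambiguous when the
    runner-up is within `margin` percentage points."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    top, second = ranked[0], ranked[1]
    ambiguous = (top[1] - second[1]) < margin
    return top[0], ambiguous

print(dominant_emotion(emotions))  # ('Happy', False): 43.4 vs 36.1
```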

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
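
Unlike the percentage scores above, Google Vision reports face attributes as likelihood buckets. A minimal sketch of mapping those buckets to ordinals so the two face records become comparable; the numeric scale is an assumption mirroring the bucket order shown here:

```python
# Map Google Vision likelihood buckets to ordinal values so the two
# face records above can be compared on a given attribute.
LIKELIHOOD = {
    "Very unlikely": 0, "Unlikely": 1, "Possible": 2,
    "Likely": 3, "Very likely": 4,
}

# Attribute values copied from the two Google Vision records above.
face_1 = {"Joy": "Possible", "Headwear": "Unlikely"}
face_2 = {"Joy": "Unlikely", "Headwear": "Very unlikely"}

def more_joyful(a, b):
    """Return which face record scores higher on the Joy attribute."""
    return "first" if LIKELIHOOD[a["Joy"]] > LIKELIHOOD[b["Joy"]] else "second"

print(more_joyful(face_1, face_2))  # first: Possible outranks Unlikely
```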

Feature analysis

Amazon

Person 94.9%
Car 84.7%

Captions

Microsoft

a man sitting in a car 63.4%
a man sitting on top of a car 57.5%
a man riding on the back of a car 48%
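
The captioning model returns several ranked candidates rather than a single answer. A short sketch of keeping only the highest-confidence caption, using the three candidates listed above:

```python
# Candidate captions with confidences, copied from this record.
captions = [
    ("a man sitting in a car", 63.4),
    ("a man sitting on top of a car", 57.5),
    ("a man riding on the back of a car", 48.0),
]

# Keep the highest-scoring candidate.
best_caption = max(captions, key=lambda c: c[1])
print(best_caption[0])  # a man sitting in a car
```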

Text analysis

Amazon

85.
390 85.
390
DUD
YТ3A-AX

Google

85.
4.
4. 390 85.
390
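
The two OCR engines disagree on parts of the image text. One quick way to see what both agree on is to intersect their whitespace-split tokens; a sketch using the raw strings above (including the garbled "YТ3A-AX" token exactly as Amazon emitted it):

```python
# Raw OCR output copied verbatim from the Text analysis section.
amazon_text = "85. 390 85. 390 DUD YТ3A-AX"
google_text = "85. 4. 4. 390 85. 390"

# Tokens detected by both engines.
shared = set(amazon_text.split()) & set(google_text.split())
print(sorted(shared))  # ['390', '85.']
```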