Human Generated Data

Title

Untitled (man and woman with two dogs and luggage by their car, Florida)

Date

c. 1965

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.326

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-02-26

Home Decor 99.8
Transportation 99.2
Car 99.2
Vehicle 99.2
Automobile 99.2
Person 98
Human 98
Person 95.8
Outdoors 88.2
Wheel 87.2
Machine 87.2
Tire 82.5
Window 79.5
Furniture 76.9
Car Wheel 68.8
Curtain 63.6
Nature 61.2
Tractor 57.8
Spoke 57.4
Window Shade 57.3
Plant 56.6
Shutter 55.8
Vegetation 55.6
Wood 55.2
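
The Amazon tags above match the output shape of AWS Rekognition's DetectLabels operation, which returns one confidence score per label (the same call also returns per-instance bounding boxes, which is where the "Feature analysis" entries further down come from). A minimal sketch in Python with boto3, assuming configured AWS credentials; the file name is a placeholder:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Send the scanned photograph as raw bytes.
with open("steinmetz.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55.0,  # the lowest score in this record is 55.2
    )

# One line per label, mirroring the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')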

Imagga
created on 2022-02-26

car 100
beach wagon 63.4
motor vehicle 53
auto 46.9
transportation 44.8
vehicle 43.4
automobile 43.1
transport 33.8
speed 32
drive 30.3
wheel 24.7
road 24.4
fast 23.4
motor 21.3
power 20.1
luxury 19.7
wheeled vehicle 19.1
travel 18.3
cars 17.6
sports 17.6
sport 17.3
style 17.1
race 16.2
chrome 16
cab 15.9
driving 15.5
expensive 15.3
modern 14.7
engine 14.4
traffic 14.2
sky 14
old 13.9
wheels 13.7
design 13.5
tire 13
street 12.9
light 12.7
vintage 12.4
urban 12.2
classic 12.1
antique 11.8
sedan 11.8
model 11.7
door 11.4
new 11.3
yellow 10.6
metal 10.5
truck 9.9
orange 9.2
garage 9.1
city 9
retro 9
bumper 9
color 8.9
wreck 8.8
automotive 8.8
shiny 8.7
performance 8.6
motion 8.6
black 8.4
outdoor 8.4
reflection 8.4
land 8.3
metallic 8.3
tourism 8.2
landscape 8.2
headlight 8.1
insurance 7.7
highway 7.7
mirror 7.6
show 7.6
technology 7.4
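
Imagga exposes comparable tagging through its v2 /tags REST endpoint. A minimal sketch with the requests library; the API key, secret, and image URL are placeholders:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz.jpg"},  # placeholder URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder credentials
)
resp.raise_for_status()

# Each entry carries an English tag name and a 0-100 confidence.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')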

Google
created on 2022-02-26

(no tags returned)

Microsoft
created on 2022-02-26

land vehicle 97.4
vehicle 97.3
car 92.3
wheel 86
text 68.5
person 55.8
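
The Microsoft tags are consistent with Azure Computer Vision's image-tagging endpoint. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("AZURE_KEY"),  # placeholder key
)

with open("steinmetz.jpg", "rb") as f:  # placeholder file name
    result = client.tag_image_in_stream(f)

# Confidences come back in the 0-1 range; the record shows them as percentages.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")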

Face analysis

AWS Rekognition

Age 16-22
Gender Female, 100%
Calm 67.4%
Angry 21.5%
Confused 3%
Sad 2.5%
Disgusted 1.9%
Happy 1.9%
Surprised 1.2%
Fear 0.5%

AWS Rekognition

Age 26-36
Gender Male, 99.9%
Happy 98.7%
Surprised 0.5%
Fear 0.2%
Angry 0.2%
Confused 0.2%
Disgusted 0.1%
Calm 0.1%
Sad 0.1%
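
The two AWS Rekognition readings above correspond to the DetectFaces operation with all facial attributes requested. A minimal sketch with boto3; the file name is a placeholder:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

# One block per detected face, mirroring the two readings above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')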

Microsoft Cognitive Services

Age 40
Gender Female
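
The single age/gender estimate matches the shape returned by the Azure Face API's detect call when age and gender attributes are requested (Microsoft has since restricted these attributes). A minimal sketch with the azure-cognitiveservices-vision-face SDK; endpoint, key, and file name are placeholders:

from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

face_client = FaceClient(
    "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("AZURE_KEY"),  # placeholder key
)

with open("steinmetz.jpg", "rb") as f:  # placeholder file name
    faces = face_client.face.detect_with_stream(
        f, return_face_attributes=["age", "gender"]
    )

for face in faces:
    print(f"Age {face.face_attributes.age:.0f}")
    print("Gender", face.face_attributes.gender)  # enum, e.g. Gender.female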

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
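
The Google Vision readings report likelihood buckets rather than percentages. A minimal sketch with the google-cloud-vision client library, assuming Google Cloud credentials are configured; the file name is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One block per detected face; likelihoods are enum buckets such as
# VERY_UNLIKELY or VERY_LIKELY, matching the wording above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)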

Feature analysis

Amazon

Car 99.2%
Person 98%
Wheel 87.2%

Captions

Microsoft

a group of people riding on the back of a pickup truck 30.6%
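
The Microsoft caption is consistent with Azure Computer Vision's describe operation, which returns ranked caption candidates with 0-1 confidences (0.306 here, shown as 30.6%). A minimal sketch with the same SDK as the tagging example; endpoint, key, and file name are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("AZURE_KEY"),  # placeholder key
)

with open("steinmetz.jpg", "rb") as f:  # placeholder file name
    analysis = client.describe_image_in_stream(f)

# Ranked caption candidates with 0-1 confidences.
for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")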