Human Generated Data

Title

Untitled (woman seated in car looking at blueprints held by a man)

Date

1954

People

Artist: Joseph Janney Steinmetz, American (1905 - 1985)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4684

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.4
Human 99.4
Person 98.4
Face 97.6
Clothing 96.6
Apparel 96.6
Transportation 84.5
Vehicle 84.5
Female 84.3
Water 79.8
Outdoors 78.9
Building 78.4
Photo 78.3
Portrait 78.3
Photography 78.3
Glasses 76.2
Accessory 76.2
Accessories 76.2
Nature 75.5
City 71.1
Urban 71.1
Town 71.1
Boat 69
Text 67.3
People 66.9
Coat 66.9
Suit 66.9
Overcoat 66.9
Tree 65.4
Plant 65.4
Woman 64.9
Architecture 62.1
Machine 62
Wheel 62
Pottery 60
Jar 60
Vase 60
Waterfront 60
Car 58.4
Automobile 58.4
Sports Car 57.8
Watercraft 57.5
Vessel 57.5
Man 56.9
Furniture 56.8
Dock 56.2
Pier 56.2
Port 56.2
High Rise 55.5
Porch 55.2
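
The Amazon tags above have the shape of AWS Rekognition DetectLabels output: a label name plus a 0-100 confidence score. A minimal sketch of how similar tags could be produced with boto3, assuming a local copy of the image; the file name and confidence threshold are placeholders, not values recorded here:

    import boto3

    client = boto3.client("rekognition")

    # Placeholder file name; the record does not include the image file itself.
    with open("steinmetz_4.2002.4684.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed cutoff; the lowest tag above is 55.2
        )

    # Print each label with its confidence, matching the list format above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")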

Imagga
created on 2022-02-05

car 40.7
passenger 26.5
automobile 25.8
vehicle 25.6
transportation 25.1
adult 23.3
sitting 23.2
newspaper 22.5
person 21.5
man 20.8
people 20.6
cockpit 20
driver 19.4
auto 19.1
product 16.6
business 16.4
happy 16.3
smile 15.7
transport 15.5
smiling 15.2
seat 14.9
male 14.2
drive 14.2
job 14.1
travel 13.4
wheel 13.2
office 12.9
creation 12.9
driving 12.6
work 12.6
working 12.4
laptop 11.8
road 11.7
cheerful 11.4
looking 11.2
portrait 11
engine 10.6
outdoors 10.4
happiness 10.2
lifestyle 10.1
motor vehicle 10
attractive 9.8
lady 9.7
computer 9.6
outside 9.4
two 9.3
professional 9.3
inside 9.2
occupation 9.2
20s 9.2
worker 9.1
pretty 9.1
urban 8.7
couple 8.7
building 8.6
modern 8.4
hand 8.4
support 8.3
speed 8.2
new 8.1
businessman 7.9
women 7.9
black 7.8
plane 7.7
corporate 7.7
men 7.7
airport 7.7
motion 7.7
one person 7.5
suit 7.5
one 7.5
technology 7.4
park 7.4
device 7.3
back 7.3
businesswoman 7.3
face 7.1
sky 7
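
The Imagga tags follow the response shape of Imagga's v2 tagging endpoint. A rough sketch using the documented REST API, with placeholder credentials and a placeholder image file:

    import requests

    API_KEY = "YOUR_API_KEY"        # placeholder
    API_SECRET = "YOUR_API_SECRET"  # placeholder

    with open("steinmetz_4.2002.4684.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )

    # Each entry carries an English tag name and a 0-100 confidence score.
    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))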

Microsoft
created on 2022-02-05

text 99.7
person 97.4
black and white 95.7
ship 95.4
sky 72.9
monochrome 69.3
man 66.1
vehicle 60.6
water 59.3
clothing 56.3
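
The Microsoft tags correspond to the Tags feature of the Azure Computer Vision Analyze Image API, which reports confidence on a 0-1 scale (shown as percentages here). The API version used in 2022 is not recorded; the sketch below assumes the v3.2 REST endpoint, with placeholder endpoint and key:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_KEY"                                                # placeholder

    with open("steinmetz_4.2002.4684.jpg", "rb") as f:
        response = requests.post(
            f"{ENDPOINT}/vision/v3.2/analyze",
            params={"visualFeatures": "Tags"},
            headers={
                "Ocp-Apim-Subscription-Key": KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )

    # Scale the 0-1 confidences to percentages to match the list above.
    for tag in response.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))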

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 98.4%
Happy 63.7%
Calm 32.7%
Surprised 1%
Sad 1%
Confused 0.5%
Fear 0.4%
Disgusted 0.4%
Angry 0.3%
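
The age range, gender, and emotion scores have the shape of AWS Rekognition DetectFaces output with all facial attributes requested. A minimal boto3 sketch, again with a placeholder file name:

    import boto3

    client = boto3.client("rekognition")

    with open("steinmetz_4.2002.4684.jpg", "rb") as f:  # placeholder file name
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include AgeRange, Gender, and Emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; sort by confidence to match the list above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")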

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
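
The Google values are the likelihood buckets returned by Cloud Vision face detection, which reports categories such as VERY_UNLIKELY rather than numeric scores. A sketch with the google-cloud-vision client library and a placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_4.2002.4684.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each likelihood field is an enum; .name yields e.g. "VERY_UNLIKELY".
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)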

Feature analysis

Amazon

Person 99.4%
Wheel 62%

Captions

Microsoft

a person sitting in front of a building 73.7%
a person sitting in front of a building 73.6%
a man and a woman sitting in front of a building 51.9%
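
Ranked caption candidates like these match the Describe Image operation of the Azure Computer Vision API, which returns up to maxCandidates captions with 0-1 confidences. A sketch against the v3.2 endpoint (version assumed; endpoint and key are placeholders):

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_KEY"                                                # placeholder

    with open("steinmetz_4.2002.4684.jpg", "rb") as f:
        response = requests.post(
            f"{ENDPOINT}/vision/v3.2/describe",
            params={"maxCandidates": 3},
            headers={
                "Ocp-Apim-Subscription-Key": KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )

    for caption in response.json()["description"]["captions"]:
        print(caption["text"], round(caption["confidence"] * 100, 1))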

Text analysis

Amazon

39461.
KODAK-EITW

Google

39461.
39461.
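
Both text readings look like standard OCR output from these services: AWS Rekognition DetectText returns line- and word-level detections, and Google Cloud Vision text detection returns a full-text annotation followed by individual elements, which is why the same string can appear more than once above. A combined sketch, with a placeholder file name:

    import boto3
    from google.cloud import vision

    with open("steinmetz_4.2002.4684.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    # AWS Rekognition: keep only LINE detections to avoid word-level duplicates.
    rekognition = boto3.client("rekognition")
    for det in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
        if det["Type"] == "LINE":
            print("Amazon:", det["DetectedText"])

    # Google Cloud Vision: the first annotation is the full detected text,
    # followed by its individual elements.
    client = vision.ImageAnnotatorClient()
    response = client.text_detection(image=vision.Image(content=image_bytes))
    for annotation in response.text_annotations:
        print("Google:", annotation.description)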