Human Generated Data

Title

Untitled (man and woman looking at blueprints in car)

Date

1954, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.206

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.6
Human 99.6
Plant 99.2
Palm Tree 99.2
Arecaceae 99.2
Tree 99.2
Person 98.6
Transportation 86
Vehicle 84
Automobile 79.3
Car 79.3
Clothing 73
Apparel 73
Person 64.9
Waterfront 58
Dock 58
Pier 58
Port 58
Water 58
Boat 57.8
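
The Amazon tag list above pairs each label with a confidence score on a 0-100 scale, which matches the output of Amazon Rekognition's label detection. A minimal Python sketch of how tags like these could be generated, assuming boto3 is installed, AWS credentials are configured, and the file name (invented here) points at a local copy of the print:

    # Minimal sketch: Rekognition-style label/confidence pairs for a single image.
    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("steinmetz_blueprints_in_car.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,       # upper bound on returned labels
        MinConfidence=55,   # drop low-confidence labels, roughly matching the list above
    )

    # Each label carries a Name and a Confidence on a 0-100 scale.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')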

Imagga
created on 2022-01-08

boat 35.1
water 30
man 26.9
sea 26.6
ocean 26.6
travel 25.3
car 21.8
vehicle 21.3
people 21.2
outdoors 19.6
transportation 18.8
summer 18.6
vacation 18
adult 17.5
beach 16.9
helm 16.5
passenger 16.5
ship 16.4
sitting 16.3
vessel 16.2
male 15.8
person 15.2
lifestyle 15.2
sky 14.7
outdoor 14.5
harbor 14.4
engineer 14.3
transport 13.7
motor vehicle 13.6
business 13.4
happiness 13.3
tourism 13.2
happy 13.2
holiday 12.9
luxury 12.9
fishing 12.5
automobile 12.4
steering system 12.4
smiling 12.3
couple 12.2
outside 12
dock 11.7
coast 11.7
fun 11.2
laptop 10.9
communication 10.9
device 10.8
tourist 10.4
shore 10.4
relaxation 10
relaxing 10
city 10
mechanism 9.9
river 9.8
boats 9.7
computer 9.6
craft 9.4
bay 9.4
work 9.4
day 9.4
professional 9.3
relax 9.3
street 9.2
lake 9.2
looking 8.8
sailing 8.8
sail 8.7
driver 8.7
full length 8.7
building 8.6
auto 8.6
idyllic 8.5
vacations 8.5
enjoyment 8.4
holidays 8.4
attractive 8.4
island 8.2
landscape 8.2
cheerful 8.1
handsome 8
job 8
businessman 7.9
equipment 7.9
20-24 years 7.9
yacht 7.8
nautical 7.8
seller 7.7
tropical 7.7
two 7.6
resort 7.5
park 7.5
mature 7.4
technology 7.4
color 7.2
golf equipment 7.2
marina 7.1
family 7.1
wheeled vehicle 7.1
scenic 7
together 7
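
The Imagga tags follow the same label-plus-confidence pattern. A rough sketch of a request against Imagga's v2 tagging endpoint, assuming the requests library plus an API key, secret, and image URL that are all placeholders here; the exact endpoint and response shape should be checked against Imagga's current documentation:

    # Rough sketch: Imagga-style tags for an image URL via the v2 tagging endpoint.
    import requests

    IMAGGA_KEY = "your_api_key"        # placeholder
    IMAGGA_SECRET = "your_api_secret"  # placeholder
    IMAGE_URL = "https://example.org/steinmetz_blueprints_in_car.jpg"  # placeholder

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth with the key/secret pair
        timeout=30,
    )
    resp.raise_for_status()

    # Each tag has an English label and a 0-100 confidence score.
    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')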

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

outdoor 98.3
text 97.9
person 97.7
black and white 94.8
man 91.2
vehicle 70.5
boat 63.4

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 99.4%
Calm 98.3%
Sad 1.1%
Confused 0.2%
Angry 0.1%
Happy 0.1%
Disgusted 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 48-56
Gender Female, 100%
Confused 63.8%
Happy 19.2%
Calm 8.6%
Fear 2.2%
Angry 1.9%
Sad 1.5%
Disgusted 1.4%
Surprised 1.4%
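
The two AWS Rekognition face entries (age range, gender, and per-emotion percentages) correspond to the facial attributes returned by Rekognition face detection. A minimal Python sketch, under the same boto3 and file-name assumptions as the label example above:

    # Minimal sketch: per-face age range, gender, and emotion scores from Rekognition.
    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("steinmetz_blueprints_in_car.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions are scored independently; sort them to mirror the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')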

Microsoft Cognitive Services

Age 55
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
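
The Google Vision entries report categorical likelihoods (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision client library, assuming it is installed, application credentials are configured, and the file name (invented here) points at a local copy of the image:

    # Minimal sketch: per-face likelihood ratings from the Google Cloud Vision API.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical local copy of the photograph.
    with open("steinmetz_blueprints_in_car.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihood fields are enum values such as VERY_UNLIKELY, POSSIBLE, VERY_LIKELY.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)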

Feature analysis

Amazon

Person 99.6%
Boat 57.8%

Captions

Microsoft

Brownie Wise sitting on a boat 77.9%
Brownie Wise riding on the back of a boat 65.5%
Brownie Wise standing in front of a boat 65.4%
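
The Microsoft tags and caption candidates above resemble the output of Azure's Computer Vision tagging and image-description operations. A rough sketch using the azure-cognitiveservices-vision-computervision client, where the endpoint, key, and image URL shown are placeholders; the client and method names should be verified against Azure's current SDK documentation:

    # Rough sketch: Azure Computer Vision tags and caption candidates for an image URL.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://your-resource.cognitiveservices.azure.com/"    # placeholder
    KEY = "your_subscription_key"                                      # placeholder
    IMAGE_URL = "https://example.org/steinmetz_blueprints_in_car.jpg"  # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Tags with confidence scores (0-1 in the API; shown as percentages in the listing above).
    for tag in client.tag_image(IMAGE_URL).tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")

    # Caption candidates, each with its own confidence.
    for caption in client.describe_image(IMAGE_URL, max_candidates=3).captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")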