Human Generated Data

Title

Untitled (woman on a lounge chair on the beach)

Date

1961

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11515

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Person 99.6
Human 99.6
Person 99.6
Person 99.3
Person 98.1
Camping 93
Tent 91.9
Airplane 90.4
Transportation 90.4
Vehicle 90.4
Aircraft 90.4
Person 88
Person 81.5
Person 75.2
Mountain Tent 56.5
Leisure Activities 56.5

Imagga
created on 2022-01-14

blackboard 66.2
monitor 39.8
television 36.6
electronic equipment 29.6
telecommunication system 27
equipment 25.7
technology 22.3
digital 17.8
light 14
black 12.6
night 12.4
three dimensional 12.2
computer 12.1
man 12.1
smoke 12.1
protection 11.8
industrial 11.8
3d 11.6
effects 11.4
water 11.3
graphics 11
people 10
dark 10
negative 9.9
sign 9.8
person 9.6
imagination 9.5
symbol 9.4
industry 9.4
safety 9.2
hand 9.1
human 9
information 8.9
destruction 8.8
work 8.6
business 8.5
mosquito net 8.5
travel 8.5
modern 8.4
transport 8.2
road 8.1
futuristic 8.1
render 7.8
factory 7.7
motion 7.7
construction 7.7
mask 7.7
city 7.5
silhouette 7.5
future 7.4
environment 7.4
film 7.4
design 7.3
global 7.3
danger 7.3
trees 7.1
sky 7

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

text 99.1
black and white 95.6
outdoor 88
water 84.1
monochrome 71.4
sky 55.1
several 10.3

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 92.5%
Calm 60.3%
Fear 11.6%
Sad 9.2%
Angry 7.6%
Disgusted 6.2%
Surprised 2.1%
Happy 1.9%
Confused 1.2%

AWS Rekognition

Age 19-27
Gender Male, 56.1%
Calm 52.1%
Fear 16%
Happy 9%
Disgusted 7.1%
Confused 5.9%
Surprised 3.5%
Sad 3.4%
Angry 3.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.6%
Airplane 90.4%

Captions

Microsoft

a group of people on a boat 48.1%
a group of people in a boat 48%
a group of people on a boat looking at the camera 33.6%

Text analysis

Amazon

472478
MJ17--YT37A°--XAX

Google

4 724 7 B
4
724
7
B