Human Generated Data

Title

Untitled (men holding tail of shark on dock)

Date

1957

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10413

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Each label below is paired with the service's confidence score, in percent.

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.1
Water 96.6
Wood 96.1
Waterfront 95
Pier 92.5
Dock 92.5
Port 92.5
Dog 88.2
Mammal 88.2
Animal 88.2
Canine 88.2
Pet 88.2
Clothing 85.9
Apparel 85.9
Shoe 74.7
Footwear 74.7
Hardwood 72.1
Building 68.5
Flooring 67.4
Person 66.1
Plywood 65.2
Floor 62.9
Boardwalk 62.5
Bridge 62.5
Pants 60.8
Shorts 59.8
Sleeve 59
Housing 59
Back 56
Shoe 54.5
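These labels have the shape of output from Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of how such a list can be produced with boto3, assuming configured AWS credentials and a hypothetical local copy of the photograph named shark_dock.jpg; the MinConfidence cutoff is also an assumption, chosen because the list above bottoms out around 55%.

    import boto3

    # Assumes AWS credentials are already configured (environment, profile, etc.).
    client = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("shark_dock.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # drop labels the service scores below 50%
    )

    # Print "Label confidence" pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

The repeated Person entries above, at three different scores, likely come from the per-instance detections that DetectLabels nests under each label.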

Imagga
created on 2022-01-09

man 34.9
device 29
male 25.5
laptop 20.2
lifestyle 19.5
people 19
technology 17.8
equipment 17.8
worker 17.8
building 17.4
work 17.3
person 17.1
industry 17.1
business 17
construction 16.2
job 15.9
working 15.9
industrial 14.5
machine 14.3
computer 13.1
adult 12.9
men 12.9
businessman 12.4
professional 11.8
handsome 11.6
engineer 11.3
happy 11.3
office 11.2
outdoors 11.2
steel 10.9
smiling 10.8
labor 10.7
day 10.2
communication 10.1
water 10
full length 9.7
support 9.5
sitting 9.4
happiness 9.4
casual 9.3
safety 9.2
house 9.2
fitness 9
iron 8.4
keyboard 8.4
site 8.4
occupation 8.2
vacation 8.2
active 8.1
builder 8
to 8
leisure activity 7.8
couple 7.8
portrait 7.8
travel 7.7
corporate 7.7
treadmill 7.7
chair 7.7
vacations 7.5
senior 7.5
cheerful 7.3
relaxing 7.3
exercise 7.3
black 7.2
holiday 7.2
sea 7
exercise device 7
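Imagga's list matches its hosted auto-tagging API. A minimal sketch using the requests library, assuming Imagga's v2 /tags endpoint, placeholder API credentials, and a hypothetical hosted copy of the image:

    import requests

    # Placeholder credentials; real values come from an Imagga account.
    IMAGGA_KEY = "your_api_key"
    IMAGGA_SECRET = "your_api_secret"

    # Hypothetical hosted copy of the photograph.
    image_url = "https://example.org/shark_dock.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    response.raise_for_status()

    # Each entry carries an English tag and a 0-100 confidence score.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")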

Google
created on 2022-01-09
(no labels listed)

Microsoft
created on 2022-01-09

person 90.3
clothing 85.8
standing 83.3
black and white 78.9
ship 52.6
posing 43.9
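These tags match Azure Computer Vision's Analyze Image operation, which scores tags on a 0-1 scale (shown above as percentages). A minimal sketch against the v3.2 REST endpoint, with a placeholder Azure resource endpoint and key:

    import requests

    # Placeholder Azure Computer Vision resource values.
    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
    KEY = "your_subscription_key"

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/shark_dock.jpg"},  # hypothetical URL
    )
    response.raise_for_status()

    # Azure reports confidence on a 0-1 scale; scale it to match the list above.
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")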

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 100%
Sad 57.7%
Calm 34.2%
Confused 5.9%
Surprised 0.6%
Happy 0.5%
Disgusted 0.5%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Happy 64.3%
Surprised 11.6%
Confused 9.8%
Calm 6.3%
Sad 3.9%
Angry 1.9%
Disgusted 1.7%
Fear 0.5%
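Both blocks above have the shape of Amazon Rekognition DetectFaces output: one entry per detected face, each with an estimated age range, a gender call, and emotion scores that sum to roughly 100%. A minimal sketch with boto3, under the same assumptions as the labeling sketch earlier:

    import boto3

    client = boto3.client("rekognition")

    with open("shark_dock.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    # Attributes=["ALL"] asks for age range, gender, emotions, and more.
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort to match the ranked lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")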

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
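The Google results match Cloud Vision face detection, which reports likelihood buckets (Very unlikely through Very likely) instead of numeric scores. A minimal sketch using the google-cloud-vision client library, assuming application default credentials are set up:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("shark_dock.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    def bucket(value):
        # Render the enum name (e.g. VERY_UNLIKELY) in the page's style.
        return vision.Likelihood(value).name.replace("_", " ").capitalize()

    for face in response.face_annotations:
        print("Surprise", bucket(face.surprise_likelihood))
        print("Anger", bucket(face.anger_likelihood))
        print("Sorrow", bucket(face.sorrow_likelihood))
        print("Joy", bucket(face.joy_likelihood))
        print("Headwear", bucket(face.headwear_likelihood))
        print("Blurred", bucket(face.blurred_likelihood))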

Feature analysis

Amazon

Person 99.8%
Dog 88.2%
Shoe 74.7%

Captions

Microsoft

a group of people standing around a bench 73.9%
a group of people posing for a photo 73.8%
a group of people posing for a picture 73.7%
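These three near-tied candidates match Azure Computer Vision's Describe Image operation, which returns ranked caption candidates with confidences. A minimal sketch against the v3.2 REST endpoint, reusing the placeholder endpoint and key from the tagging sketch:

    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    KEY = "your_subscription_key"  # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},  # ask for several caption candidates
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/shark_dock.jpg"},  # hypothetical URL
    )
    response.raise_for_status()

    for caption in response.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")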

Text analysis

Amazon

ELSTH
MJ17--YT37A- -A

Google

KODVK--2.LEL
KODVK--2.LEL
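The strings above are raw OCR output; fragments like KODVK are plausibly misread film-edge markings (KODAK). A minimal sketch of how such detections come back from Amazon Rekognition's DetectText operation with boto3, under the same assumptions as the earlier sketches:

    import boto3

    client = boto3.client("rekognition")

    with open("shark_dock.jpg", "rb") as f:  # hypothetical local file
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})

    # LINE detections correspond to raw strings like those listed above;
    # WORD detections repeat the same text piece by piece.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])

Google Cloud Vision's text_detection likewise returns the full detected text first and then each word, which may be why the same string appears twice in the Google block.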