Human Generated Data

Title

Untitled (couple and child on boat posed with a large lobster)

Date

1949

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10595

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label and model confidence, 0–100)

Amazon
created on 2022-01-09

Person 99.2
Human 99.2
Person 98.8
Person 96.3
Face 95.8
Transportation 88.4
Vehicle 88.3
Female 87
Clothing 83.7
Apparel 83.7
Countryside 82.7
Rural 82.7
Nature 82.7
Shelter 82.7
Outdoors 82.7
Building 82.7
Car 78.8
Automobile 78.8
Airplane 77.3
Aircraft 77.3
Kid 75.7
Child 75.7
Meal 72.8
Food 72.8
Girl 71.6
Plant 67
Dress 66.8
Portrait 65.7
Photography 65.7
Photo 65.7
Woman 64.1
Shorts 63.3
Pool 62.2
Water 62.2
Tire 61.4
Yard 59.7
Logo 57.2
Symbol 57.2
Trademark 57.2
Machine 56.6
Crowd 56.5
Swimming Pool 56
Wheel 55.6
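
Labels like those above are what Amazon Rekognition's DetectLabels operation returns; a minimal boto3 sketch, assuming a local copy of the digitized print and an illustrative confidence threshold (both placeholders, not part of the record):

```python
import boto3

# Rekognition client; region and credentials come from the standard AWS config.
client = boto3.client("rekognition")

# "photo.jpg" is a placeholder for the digitized print.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed cutoff; the list above bottoms out near 55
    )

# Each label carries a name and a 0-100 confidence score, as listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```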

Clarifai
created on 2023-10-26

people 99.8
monochrome 99.1
vehicle 97.7
group together 97.3
adult 97.1
woman 97
man 95.6
transportation system 95.4
street 94.9
two 94.4
music 92.5
group 88.2
wear 87.7
three 86.6
portrait 86.5
watercraft 85.1
child 84.9
retro 82.1
aircraft 81.5
several 78.1
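
Clarifai concepts of this kind can be requested over its v2 REST predict endpoint; the sketch below is an assumption-laden illustration, with the personal access token, model ID, and image URL all placeholders:

```python
import requests

# All identifiers below are illustrative placeholders.
PAT = "YOUR_CLARIFAI_PAT"
MODEL_ID = "general-image-recognition"  # assumed public model ID
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts come back with a name and a 0-1 confidence value.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```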

Imagga
created on 2022-01-09

vehicle 65.5
car 63.2
transportation 36.7
motor vehicle 31.1
auto 28.7
bobsled 28
automobile 27.8
seat 24.7
sled 22.4
device 21.4
wheeled vehicle 20.1
transport 20.1
drive 19.9
driver 19.4
man 18.8
support 16.9
conveyance 16.4
adult 16.2
driving 15.5
road 15.3
iron lung 15.3
wheel 15.1
male 14.9
speed 14.6
rumble 14.6
truck 14
people 13.9
happy 13.8
person 12.7
engine 12.5
amphibian 12.3
respirator 12.2
sitting 12
luxury 12
tramway 11.5
model t 11.4
traffic 11.4
travel 11.3
machine 11.2
passenger 11.1
inside 11
smile 10.7
modern 10.5
outdoors 10.4
smiling 10.1
rumble seat 10
motor 9.8
new 9.7
business 9.7
metal 9.6
happiness 9.4
breathing device 9.2
old 9.1
fun 9
accident 8.8
insurance 8.7
model 8.6
power 8.4
fire engine 8
work 7.8
station 7.7
summer 7.7
sky 7.7
casual 7.6
fast 7.5
technology 7.4
cheerful 7.3
danger 7.3
color 7.2
convertible 7
train 7
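
Imagga exposes a comparable tagging service; a hedged sketch against its v2 tags endpoint, with the API key, secret, and image URL as placeholders:

```python
import requests

# Placeholders; Imagga issues a key/secret pair used as HTTP Basic auth.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence, as in the list above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```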

Microsoft
created on 2022-01-09

text 98.9
black and white 92.1
outdoor 86.5
vehicle 57.8
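
The Microsoft tags are the kind returned by Azure's Computer Vision image-tagging operation; a minimal sketch with the azure-cognitiveservices-vision-computervision SDK, with the endpoint, key, and file name as placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholders for the Azure resource endpoint and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Tag the local image; each tag has a name and a 0-1 confidence.
with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```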

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 62.7%
Happy 69%
Calm 25.9%
Sad 1.9%
Angry 1.1%
Disgusted 0.9%
Surprised 0.6%
Fear 0.3%
Confused 0.3%

AWS Rekognition

Age 30-40
Gender Male, 96.4%
Happy 91.5%
Calm 3.1%
Surprised 1.5%
Disgusted 1.3%
Confused 1.1%
Angry 0.7%
Sad 0.4%
Fear 0.3%

AWS Rekognition

Age 37-45
Gender Male, 80.5%
Calm 95.1%
Surprised 2.6%
Happy 1.1%
Sad 0.4%
Angry 0.2%
Disgusted 0.2%
Confused 0.2%
Fear 0.2%
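
The three age/gender/emotion blocks above match the per-face output of Rekognition's DetectFaces operation with full attributes; a minimal boto3 sketch, again assuming a local placeholder file:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetail per detected face: age range, gender, and ranked emotions.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in emotions:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```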

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
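
Google Vision reports face attributes as likelihood buckets rather than scores; a minimal sketch with the google-cloud-vision client, with the file name again a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY), not a score.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```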

Feature analysis

Amazon

Person 99.2%
Airplane 77.3%

Categories

Imagga

cars vehicles 99.9%

Text analysis

Amazon

MARANN
MIAMI
26477
YT37A2-XAGOX

Google

MIAMI
26477
YT3R
XA
MIAMI 26477 YT3R XA
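
The strings above (boat name, port, and registration number) are OCR output; a minimal boto3 sketch of Rekognition's DetectText operation, with the file name a placeholder (Google's readings would come from the Vision client's text_detection method in the same way):

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections roughly match the strings listed above (e.g. "MIAMI", "26477").
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```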