Human Generated Data

Title

Untitled (people around pool at Longboat Key Towers, FL)

Date

c. 1965

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11433

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Person 99.5
Human 99.5
Person 97.6
Person 95.1
Chair 93.8
Furniture 93.8
Shorts 93.4
Clothing 93.4
Apparel 93.4
Nature 88.1
Road 86
Outdoors 82.8
Building 82.2
Meal 77.2
Food 77.2
Water 73
People 69
Vehicle 68.1
Transportation 68.1
Architecture 66.9
Grass 64.9
Plant 64.9
Car 62.8
Automobile 62.8
Weather 62.8
Tarmac 59.4
Asphalt 59.4
Waterfront 58.6
Urban 57.5
Villa 57.1
Housing 57.1
House 57.1
Sedan 57
Freeway 55.5
Pool 55.1
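
The label/confidence pairs above have the shape of an Amazon Rekognition DetectLabels response. Below is a minimal sketch of how tags like these could be regenerated with boto3; the S3 bucket name and object key are placeholders, not values from this record.

# Minimal sketch: regenerate label tags of this shape with Amazon Rekognition.
# The bucket name and object key below are placeholders, not part of this record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-pool.jpg"}},
    MaxLabels=50,
    MinConfidence=55.0,  # the lowest score in the list above is roughly 55
)

# Print "Label Confidence" pairs, mirroring the tag list format above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')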

Clarifai
created on 2023-10-25

people 99.9
adult 98.8
group together 98.5
man 97.9
two 97.8
vehicle 97.5
monochrome 95.7
woman 95.5
recreation 95.3
furniture 95.2
group 93.5
water 93.4
watercraft 92.8
reclining 92.6
child 92
seat 91.9
chair 89.9
transportation system 89.4
beach 89.2
one 88.4

Imagga
created on 2022-01-14

chairlift 33
ski tow 26.5
sky 21
ship 20.5
conveyance 19.7
travel 19
water 18.7
sea 18
ocean 16.1
vessel 14.1
tourism 12.4
vacation 12.3
structure 11.9
boat 11.9
technology 11.9
architecture 11.7
equipment 11.2
summer 10.9
building 10.6
landscape 10.4
liner 10.4
beach 10.3
business 10.3
black 9.6
vehicle 9.5
cloud 9.5
device 9.5
work 9.4
construction 9.4
power 9.2
speed 9.2
deck 9.2
transport 9.1
design 9
negative 9
transportation 9
digital 8.9
urban 8.7
monitor 8.6
film 8.3
city 8.3
tower 8.1
light 8
cockpit 7.9
high 7.8
sport 7.8
3d 7.7
luxury 7.7
craft 7.7
industry 7.7
effects 7.6
energy 7.6
sign 7.5
three dimensional 7.5
man 7.4
support 7.3
graphics 7.3
warship 7.2
road 7.2
night 7.1

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

text 96
outdoor 94
black and white 90.8
old 89.4
ship 89.4
water 85.4
black 80.3
white 78.5
vintage 69.4
engine 25.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 28-38
Gender Male, 93.6%
Calm 57.7%
Confused 15.8%
Fear 11.4%
Sad 7.8%
Disgusted 2.8%
Happy 2.5%
Surprised 1.1%
Angry 1%

AWS Rekognition

Age 23-31
Gender Male, 75.8%
Calm 99.7%
Sad 0.1%
Happy 0.1%
Surprised 0.1%
Fear 0%
Disgusted 0%
Confused 0%
Angry 0%
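
The age range, gender, and emotion scores above match the fields returned by Amazon Rekognition DetectFaces. A hedged sketch follows, again with a placeholder image location.

# Sketch: face attributes (age range, gender, emotions) via Amazon Rekognition.
# The bucket name and object key are placeholders for illustration.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-pool.jpg"}},
    Attributes=["ALL"],  # include age range, gender, and emotions in the response
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are printed from most to least confident, as in the record above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')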

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
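
The likelihood buckets above (Very unlikely, Unlikely, and so on) correspond to Google Cloud Vision face detection. A sketch using the Python client is below; the local file path is a placeholder and credentials are assumed to come from the environment.

# Sketch: per-face likelihoods via the Google Cloud Vision Python client.
# The file path is a placeholder; credentials are taken from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz-pool.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum: VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)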

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

interior objects 93.9%
cars vehicles 3.7%

Text analysis

Amazon

HAGOY
COVEELA--EIIW
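
The strings above appear to be noisy OCR readings from signage in the photograph and are kept verbatim. Below is a sketch of how line-level text detections like these could be produced with Amazon Rekognition DetectText; the image location is a placeholder.

# Sketch: line-level OCR detections via Amazon Rekognition DetectText.
# The bucket name and object key are placeholders, not part of this record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-pool.jpg"}}
)

# Keep only LINE detections; WORD entries are children of the lines.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])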