Human Generated Data

Title

Untitled (people gathered around pool, Bird Key Yacht Club, Florida)

Date

1959

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11486

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Person 99.3
Human 99.3
Person 98.5
Person 97.2
Person 92.8
Person 89.3
Nature 87
Person 86.3
Person 85.9
Outdoors 80.7
Person 79.2
Person 78.5
Vehicle 67.4
Transportation 67.4
People 65.1
Water 64.6
Crowd 63.8
Building 61.9
Person 60.1
Harbor 59.1
Pier 59.1
Dock 59.1
Port 59.1
Waterfront 59.1

Clarifai
created on 2023-10-26

people 99.8
many 99
group together 99
group 98
adult 97.2
man 96.8
vehicle 95.9
war 95.8
crowd 94.1
monochrome 93.1
military 90.7
soldier 90.1
home 89.1
furniture 88.3
seat 88.1
transportation system 87.9
spectator 86.2
railway 84.6
administration 82.7
no person 82.6

Imagga
created on 2022-01-14

half track 100
vehicle 100
military vehicle 90.5
tracked vehicle 90.3
wheeled vehicle 49.3
conveyance 48
machine 37.8
tractor 29.8
thresher 27.7
old 27.2
farm machine 25.9
transportation 25.1
farm 24.1
rural 22.9
transport 20.1
agriculture 18.4
field 16.7
grass 16.6
sky 16.6
machinery 15.6
car 15.1
dirt 14.3
rusty 14.3
wheel 14.2
landscape 14.1
work 14.1
tank 13.9
device 13.6
truck 13.4
antique 13
summer 12.9
industry 12.8
field artillery 12.2
tire 11.9
artillery 11.9
industrial 11.8
track 11.5
bulldozer 11.5
equipment 11.4
farmer 11
road 10.8
plow 10.8
tree 10.8
wheels 10.8
outdoor 10.7
wagon 10.7
outdoors 10.5
abandoned 9.8
driving 9.7
metal 9.7
engine 9.6
farming 9.5
construction 9.4
iron 9.3
power 9.2
house 9.2
hay 9.1
countryside 9.1
vintage 9.1
danger 9.1
working 8.8
agricultural 8.8
building 8.7
armament 8.7
broken 8.7
auto 8.6
heavy 8.6
drive 8.5
harvester 8.5
crop 8.5
land 8.3
autumn 7.9
country 7.9
wreck 7.9
architecture 7.8
dust 7.8
rust 7.7
stone 7.7
historical 7.5
harvest 7.5
environment 7.4
man 7.4
locomotive 7.3
yellow 7.3
black 7.2
meadow 7.2
history 7.2
travel 7

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

text 98.7
cemetery 83.9
black and white 82.1
grave 81.1
black 77.2
house 72
white 71.9
sky 63.8
old 59.1

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 82.5%
Calm 75.8%
Disgusted 11.7%
Angry 5.5%
Sad 2.6%
Fear 1.6%
Surprised 1.3%
Happy 0.9%
Confused 0.6%

AWS Rekognition

Age 16-24
Gender Female, 73.4%
Calm 95.3%
Sad 1.8%
Happy 0.9%
Surprised 0.7%
Fear 0.6%
Disgusted 0.4%
Confused 0.2%
Angry 0.1%

AWS Rekognition

Age 25-35
Gender Female, 55.8%
Calm 99.5%
Sad 0.2%
Happy 0.1%
Disgusted 0%
Angry 0%
Fear 0%
Confused 0%
Surprised 0%

AWS Rekognition

Age 22-30
Gender Female, 82.6%
Sad 57.6%
Calm 22.7%
Fear 13.5%
Angry 4.5%
Happy 0.7%
Disgusted 0.6%
Confused 0.2%
Surprised 0.2%

AWS Rekognition

Age 16-22
Gender Male, 60.1%
Calm 65%
Sad 11.4%
Disgusted 7.5%
Confused 6.2%
Fear 2.8%
Happy 2.8%
Angry 2.4%
Surprised 1.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.3%

Text analysis

Amazon

MJI7--Y
MJI7--Y E6E9+
E6E9+

Google

EEEE et63.93 MJ」ヨー-Y
EEEE
et63.93
MJ
-Y