Human Generated Data

Title

Untitled (man feeding dolphin, spectators on dock, Floridaland)

Date

c. 1955, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12222

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.8
Human 99.8
Person 99.5
Person 99.5
Person 99.3
Water 91.9
Plant 82.9
Tree 79.1
Palm Tree 78
Arecaceae 78
Waterfront 70.6
Outdoors 70.1
Dock 68.1
Pier 68.1
Port 68.1
Animal 62.9
Sea Life 62.9
Nature 57.2
Transportation 55.2
Vehicle 55.2
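
The labels above are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how comparable labels could be requested with boto3; the filename, region, and thresholds below are assumptions, not part of the record:

```python
# Minimal sketch of label detection with AWS Rekognition via boto3.
# Assumes AWS credentials are configured and the scan is available locally;
# the filename below is hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_floridaland.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,       # the list above shows 20 labels
    MinConfidence=55,   # lowest confidence shown above is about 55
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```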

Clarifai
created on 2019-11-16

people 99.9
group 99.2
adult 97.4
man 97.4
group together 97.3
watercraft 97.2
vehicle 96.5
many 96
water 95.7
woman 92.5
recreation 90.5
two 90.1
music 89.2
several 88.6
transportation system 87.6
travel 87.4
no person 86.5
beach 86.5
wear 86.3
boat 85.7

Imagga
created on 2019-11-16

percussion instrument 36.3
musical instrument 35
water 20.7
man 18.8
shopping cart 18.5
steel drum 18.2
sky 17.2
sunset 16.2
silhouette 15.7
handcart 15.6
people 15.1
male 14.2
beach 12.7
landscape 12.6
wheeled vehicle 12.6
black 12
tree 11.8
sea 11.7
relaxation 11.7
device 11.4
lake 11.3
outdoors 11.2
stringed instrument 10.8
marimba 10.6
reflection 10.6
sitting 10.3
container 10.3
sun 10
travel 9.9
trees 9.8
couple 9.6
dusk 9.5
day 9.4
ocean 9.3
person 9.2
vibraphone 9.2
business 9.1
building 8.8
lifestyle 8.7
men 8.6
outdoor 8.4
summer 8.4
leisure 8.3
tourism 8.2
transportation 8.1
light 8
river 8
sand 7.9
fisherman 7.7
construction 7.7
old 7.7
tropical 7.7
work 7.6
dark 7.5
evening 7.5
shore 7.4
island 7.3
horizon 7.2
rural 7
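
Imagga's tags resemble the response of its v2 tagging endpoint. A hedged sketch using the requests library; the credentials, image URL, and response field names below are assumptions:

```python
# Hedged sketch of tagging an image with the Imagga v2 REST API via requests.
# Credentials and image URL are placeholders; treat the response field
# names as assumptions.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/untitled_floridaland.jpg"  # hypothetical

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```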

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

sky 98.5
text 98.4
boat 94.9
outdoor 93
watercraft 92.7
water 92.4
ship 89.4
lake 86
vehicle 55.1
several 13.9
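
Microsoft's tags match the shape of an Azure Computer Vision Analyze Image response with the Tags visual feature. A rough sketch via the REST API; the endpoint, key, and filename are placeholders:

```python
# Hedged sketch of image tagging with the Azure Computer Vision REST API.
# Endpoint, key, API version, and filename below are placeholders/assumptions.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_CV_KEY"

with open("untitled_floridaland.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Azure reports confidence as a 0-1 fraction; scale to match the list above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```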

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 33-49
Gender Male, 54.9%
Fear 45.1%
Angry 47%
Sad 46.3%
Happy 45.1%
Calm 51.5%
Confused 45%
Surprised 45.1%
Disgusted 45%

AWS Rekognition

Age 31-47
Gender Male, 55%
Surprised 45.2%
Fear 45.5%
Disgusted 45.1%
Calm 48.1%
Angry 49.1%
Confused 45.2%
Happy 46.2%
Sad 45.6%

AWS Rekognition

Age 26-40
Gender Female, 54.9%
Sad 45%
Surprised 45%
Confused 45%
Fear 45%
Angry 45%
Happy 47.8%
Calm 52.1%
Disgusted 45%

AWS Rekognition

Age 33-49
Gender Female, 50.5%
Angry 49.6%
Disgusted 49.7%
Surprised 49.5%
Confused 49.6%
Fear 49.6%
Calm 49.7%
Sad 49.7%
Happy 49.6%
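
Each face record above follows the structure of AWS Rekognition's DetectFaces response when all attributes are requested: an age range, a gender estimate with confidence, and per-emotion confidences. A minimal boto3 sketch under the same assumptions as the label example:

```python
# Sketch of face attribute detection with AWS Rekognition via boto3.
# The filename is hypothetical; credentials are assumed to be configured.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_floridaland.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],   # request age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```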

Feature analysis

Amazon

Person 99.8%

Categories

Imagga

cars vehicles 97.7%

Text analysis

Amazon

57457
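
The detected string above is the sort of result AWS Rekognition's DetectText operation returns. A short sketch, again with a hypothetical filename:

```python
# Sketch of text detection with AWS Rekognition via boto3.
# Filename and credentials as in the earlier sketches (hypothetical).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_floridaland.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Keep only full lines; Rekognition also returns individual WORD items.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], detection["Confidence"])
```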

Google

5.57
5.57