Human Generated Data

Title

Untitled (dolphin jumping out of water to take fish from woman)

Date

1965

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8130

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 99.3
Water 99.2
Person 98.2
Outdoors 97.2
Nature 96.5
Person 95.9
Person 81.5
Waterfront 72
Pier 68.9
Dock 68.9
Port 68.9
Back 67.7
Land 66.9
People 63.7
Silhouette 62.8
Transportation 61.9
Vehicle 61.9
Shorts 60.5
Clothing 60.5
Apparel 60.5
Boat 57.4
Waterfowl 56.6
Animal 56.6
Bird 56.6
Oars 56.5
Kid 55.1
Child 55.1
Ice 55.1

Clarifai
created on 2023-10-26

people 99.9
group 99.4
water 98.2
group together 97.7
monochrome 97.4
river 97
child 96.5
adult 95.7
recreation 95.5
lake 94.5
two 93.2
watercraft 93.1
man 92.2
several 90
woman 89.4
reflection 89.1
vehicle 88.1
beach 88
three 86.9
family 86.5

Imagga
created on 2022-01-15

shopping cart 34.2
percussion instrument 29.3
handcart 29.1
vibraphone 25.7
musical instrument 24.3
wheeled vehicle 21.7
water 17.4
sky 16.7
silhouette 16.6
device 15.4
container 15.4
fountain 14.6
structure 14
people 13.4
sunset 12.6
landscape 11.9
man 11.4
sun 10.7
outdoor 10.7
male 10.6
swing 10.5
old 10.5
park 10.2
black 10.2
light 10
snow 9.7
person 9.7
holiday 9.3
clouds 9.3
beach 9.1
symbol 8.8
sand 8.7
work 8.7
men 8.6
winter 8.5
mechanical device 8.5
industrial 8.2
ocean 8.1
platform 8.1
building 8
sea 7.9
business 7.9
lake 7.9
day 7.8
season 7.8
tree 7.8
travel 7.7
plaything 7.7
conveyance 7.7
dark 7.5
ice 7.5
marimba 7.4
natural 7.4
reflection 7.3
mechanism 7.3
architecture 7

Google
created on 2022-01-15

Plant 91.6
Organism 86
Black-and-white 85.5
Style 83.8
Font 81.6
Adaptation 79.4
Art 78.4
Rectangle 75.5
Beauty 74.8
Monochrome photography 73.8
Monochrome 73.4
Room 71
Visual arts 70.4
Event 68.6
Darkness 67.9
Vintage clothing 65.4
Chair 65.4
Tree 65.2
Wetland 63.1
Photographic paper 62.9

Microsoft
created on 2022-01-15

text 99.7
water 78.9

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 98%
Calm 55.2%
Sad 41.2%
Fear 1.2%
Happy 0.9%
Disgusted 0.5%
Angry 0.4%
Confused 0.3%
Surprised 0.3%

AWS Rekognition

Age 24-34
Gender Male, 87.7%
Calm 43.4%
Happy 35.9%
Disgusted 8.9%
Surprised 3.7%
Angry 2.9%
Confused 2.5%
Fear 1.5%
Sad 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft
created on 2022-01-15

a person standing in front of water 36.1%

Text analysis

Amazon

KODVR
SLT15
-
VINTAGE

Google

51275
51275