Human Generated Data

Title

Untitled (boy diving off board into pool)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8847

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.1
Human 99.1
Bird 94.6
Animal 94.6
People 84.3
Person 76.9
Housing 73.7
Building 73.7
Person 67.3
House 56
Villa 55.8
Team Sport 55
Team 55
Sport 55
Sports 55
Person 45.3
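
The Amazon labels above are the kind of output returned by the Rekognition DetectLabels API. A minimal sketch of how such tags could be reproduced with boto3; the file name and region are placeholders, not part of this record:

    # Sketch: object/scene labels with Amazon Rekognition DetectLabels (boto3).
    # "photo.jpg" and the region are placeholders.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=40,  # keep low-confidence guesses such as "Person 45.3"
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')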

Clarifai
created on 2023-10-26

people 99.9
group together 98.7
adult 98
monochrome 97.3
man 96.1
many 95.6
bench 95.4
group 95.1
furniture 94.2
seat 92.8
woman 92.4
two 90.7
child 90.2
wear 88.9
recreation 88.5
one 88.2
sports equipment 87.3
several 86.5
home 86.3
cavalry 85.1
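
The Clarifai concepts could plausibly be reproduced through Clarifai's v2 predict REST endpoint; the sketch below assumes the public general recognition model ID and uses a placeholder API key and image URL.

    # Sketch: concept predictions from Clarifai's v2 REST API.
    # The model ID, API key, and image URL are assumptions / placeholders.
    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"
    MODEL_ID = "general-image-recognition"  # assumed public general model

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    response.raise_for_status()

    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')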

Imagga
created on 2022-01-15

sky 26.1
landscape 25.3
bench 25.2
snow 24.8
trampoline 24.7
structure 24.5
tree 24.1
billboard 23.9
stage 23.7
park bench 23.2
gymnastic apparatus 21.7
winter 19.6
trees 18.7
signboard 18.5
platform 18.4
seat 17.9
scene 17.3
sports equipment 16.5
cold 15.5
water 15.3
old 15.3
outdoor 15.3
weather 15.2
park 14.8
night 14.2
season 14
travel 13.4
building 12.9
city 12.5
light 12
black 12
furniture 11.9
sunset 11.7
forest 11.3
equipment 11
scenery 10.8
lonely 10.6
rural 10.6
cloud 10.3
lake 10.2
architecture 10.2
sun 9.7
skyline 9.5
clouds 9.3
television 9.3
dark 9.2
alone 9.1
fog 8.7
sunrise 8.4
evening 8.4
field 8.4
lights 8.3
wood 8.3
shopping cart 8.3
fence 8.2
road 8.1
cool 8
urban 7.9
freeze 7.8
wheeled vehicle 7.8
outside 7.7
ocean 7.5
vacation 7.4
peace 7.3
countryside 7.3
branch 7.3
color 7.2
beach 7.2
horizon 7.2
history 7.2
river 7.1
grass 7.1
ice 7.1
summer 7.1
scenic 7
seasonal 7
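
The Imagga tags correspond to its v2 tagging endpoint; a minimal sketch using HTTP basic auth, with the key, secret, and image URL as placeholders:

    # Sketch: tags from Imagga's v2 /tags endpoint (HTTP basic auth).
    # The API key/secret and image URL are placeholders.
    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
    )
    response.raise_for_status()

    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')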

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.6
black and white 95.1
tree 94.7
monochrome 75.3
house 72.1
water 63.4
white 60.4
sky 59.5
old 47
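
The Microsoft tags match the shape of Azure Computer Vision's image tagging operation; a sketch against the v3.2 REST endpoint, with the resource endpoint, key, and image URL as placeholders:

    # Sketch: image tags from Azure Computer Vision's v3.2 "tag" operation.
    # The endpoint, subscription key, and image URL are placeholders.
    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.com/photo.jpg"},
    )
    response.raise_for_status()

    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')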

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 99.3%
Calm 64.9%
Sad 16.6%
Disgusted 5%
Fear 4.9%
Happy 3.2%
Surprised 2.7%
Angry 1.7%
Confused 1%

AWS Rekognition

Age 23-33
Gender Female, 54.9%
Calm 49.7%
Happy 46.1%
Confused 1.3%
Fear 1%
Sad 0.9%
Disgusted 0.4%
Angry 0.3%
Surprised 0.3%
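
The two AWS Rekognition face records above (age range, gender, and emotion percentages) match the output of the DetectFaces API when all attributes are requested; a minimal boto3 sketch, with the file name as a placeholder:

    # Sketch: face attributes (age range, gender, emotions) via Rekognition DetectFaces.
    # "photo.jpg" is a placeholder file name.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')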

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely
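
Google Vision reports likelihood buckets rather than percentages; a sketch with the google-cloud-vision client library, with the file name as a placeholder:

    # Sketch: per-face likelihoods via the google-cloud-vision client.
    # "photo.jpg" is a placeholder file name.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)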

Feature analysis

Amazon

Person 99.1%
Bird 94.6%

Categories

Text analysis

Amazon

39483-C

Google

39 4m 83-C YT3RA 2--AGO
39
4m
83-C
YT3RA
2--AGO
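
The readings above are the text each service detected in the photograph; a minimal sketch of the Amazon side using the Rekognition DetectText API, with the file name as a placeholder:

    # Sketch: read printed text (e.g. the "39483-C" reading) with Rekognition DetectText.
    # "photo.jpg" is a placeholder file name.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # WORD entries repeat the same tokens
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')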