Human Generated Data

Title

Untitled (art class painting on waterfront, Sarasota, Florida)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11682

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.3
Human 99.3
Person 99.1
Person 98.4
Person 95.7
Person 94.9
Person 92.5
Person 90.8
Outdoors 88.1
Nature 84.7
Person 84.4
Person 82.5
Person 78.1
Person 77
Face 70.6
Person 67.6
Workshop 66.2
People 66.1
Person 64.9
Person 64.4
Chair 60.4
Furniture 60.4
Building 60.2
Crowd 59.5
Tree 57.8
Plant 57.8
Countryside 57.4
Person 55.1
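
The label/score pairs above are the kind of output Amazon Rekognition's label detection returns, with confidence reported on a 0-100 scale. A minimal sketch with boto3, assuming AWS credentials are configured and a local copy of the photograph (the file name is hypothetical):

import boto3

# Hypothetical local copy of the photograph; any JPEG or PNG bytes work.
with open("steinmetz_4.2002.11682.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectLabels returns label names with 0-100 confidence scores,
# comparable to the "Person 99.3", "Outdoors 88.1", ... pairs above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))

An image already stored in S3 can be passed as Image={"S3Object": {...}} instead of raw bytes, and each returned label also carries an "Instances" list of bounding boxes, which is likely where per-object entries such as the "Person 99.3%" and "Chair 60.4%" rows under Feature analysis further down come from.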

Clarifai
created on 2023-10-26

people 99.8
war 98.9
military 98.6
many 98.3
group 98.2
adult 97.7
soldier 97.2
group together 95.3
man 94.9
skirmish 94.5
woman 92.9
administration 92.5
wear 91.3
tent 90.8
campsite 88.6
vehicle 87
army 85.6
crowd 83.7
weapon 83.6
art 83.3
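
The Clarifai concepts follow the same word/score pattern, with scores that read as percentages. A rough sketch against Clarifai's classic v2 predict REST endpoint using the requests library; the API key, model ID, and image URL below are placeholders, and the general recognition model ID is an assumption:

import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"                 # placeholder
MODEL_ID = "general-image-recognition"            # assumed public model ID
IMAGE_URL = "https://example.org/steinmetz.jpg"   # placeholder

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Each concept carries a name and a 0-1 value; scaling by 100 gives
# numbers comparable to "people 99.8", "group 98.2", and so on.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))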

Imagga
created on 2022-01-15

television 36.1
telecommunication system 20.3
monitor 18.9
black 18
sky 17.9
old 17.4
landscape 16.4
grunge 15.3
negative 14.2
travel 14.1
building 13.8
structure 13.4
film 13.4
architecture 13.3
electronic equipment 13.1
construction 12.8
city 12.5
river 12.5
vintage 12.4
water 12
park 11.9
dirty 11.7
scenery 11.7
equipment 11.2
wreckage 11.1
mountains 11.1
winter 11.1
industrial 10.9
history 10.7
snow 10.5
scenic 10.5
forest 10.4
smoke 10.2
part 10.2
house 10
stage 9.8
trees 9.8
destruction 9.8
factory 9.7
fog 9.7
environment 9
broadcasting 9
outdoors 9
rock 8.7
skyline 8.5
industry 8.5
historical 8.5
frame 8.4
outdoor 8.4
sax 8.4
power 8.4
texture 8.3
tourism 8.2
pattern 8.2
retro 8.2
platform 8.1
vehicle 8.1
transportation 8.1
antique 7.8
cold 7.7
wheeled vehicle 7.7
drawing 7.5
dark 7.5
exterior 7.4
natural 7.4
new 7.3
danger 7.3
paint 7.2
border 7.2
country 7
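
Imagga's tagging service produces the same kind of word/confidence list. A hedged sketch against Imagga's v2 REST tagging endpoint, which uses HTTP Basic authentication with an API key and secret (the credentials and image URL are placeholders):

import requests

IMAGGA_KEY = "YOUR_KEY"                           # placeholder
IMAGGA_SECRET = "YOUR_SECRET"                     # placeholder
IMAGE_URL = "https://example.org/steinmetz.jpg"   # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Imagga reports confidence on a 0-100 scale, as in the list above.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))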

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.9
book 92.5
grave 87.7
cemetery 86.5
old 83.6
drawing 53.5
person 52
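
The Microsoft tags are consistent with output from Azure's Computer Vision image-tagging REST API, which reports confidence on a 0-1 scale (the values above read as that confidence multiplied by 100). A sketch under that assumption; the endpoint host, subscription key, and image URL are placeholders:

import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
IMAGE_URL = "https://example.org/steinmetz.jpg"                       # placeholder

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY,
             "Content-Type": "application/json"},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Tags arrive as name/confidence pairs with confidence in [0, 1].
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))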

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Male, 99.8%
Calm 99.1%
Sad 0.4%
Happy 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 42-50
Gender Male, 96.3%
Calm 96.5%
Happy 1.8%
Sad 0.7%
Surprised 0.5%
Confused 0.3%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 50-58
Gender Male, 78.2%
Calm 99.4%
Sad 0.4%
Happy 0.1%
Disgusted 0.1%
Angry 0%
Surprised 0%
Fear 0%
Confused 0%

AWS Rekognition

Age 38-46
Gender Male, 97.1%
Happy 86.3%
Sad 6.9%
Calm 3.5%
Surprised 0.9%
Disgusted 0.8%
Angry 0.6%
Fear 0.5%
Confused 0.5%

AWS Rekognition

Age 23-31
Gender Female, 76.1%
Calm 54.1%
Happy 38.7%
Sad 4.8%
Fear 1.2%
Confused 0.4%
Surprised 0.4%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 9-17
Gender Female, 86.1%
Calm 59.7%
Happy 26.8%
Sad 6.3%
Fear 3.5%
Disgusted 1.2%
Confused 1%
Surprised 1%
Angry 0.6%

AWS Rekognition

Age 25-35
Gender Male, 99.3%
Calm 99.9%
Happy 0%
Sad 0%
Confused 0%
Surprised 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Male, 97.4%
Calm 91.4%
Confused 3.5%
Happy 2.4%
Sad 1.2%
Surprised 0.4%
Fear 0.4%
Angry 0.3%
Disgusted 0.3%

AWS Rekognition

Age 28-38
Gender Female, 93%
Calm 52.6%
Happy 35.4%
Sad 4.7%
Surprised 2.7%
Disgusted 1.6%
Angry 1.2%
Confused 1.2%
Fear 0.7%
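
Each "AWS Rekognition" block above describes one detected face: an estimated age range, a gender guess with its confidence, and a distribution over emotion labels. A minimal boto3 sketch of how such per-face records are produced (the local file name is hypothetical):

import boto3

with open("steinmetz_4.2002.11682.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    # Emotions are scored individually; the blocks above list all of them,
    # sorted from most to least confident.
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f'Age {age["Low"]}-{age["High"]}, '
          f'{gender["Value"]} {gender["Confidence"]:.1f}%')
    for emotion in emotions:
        print(f'  {emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')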

Feature analysis

Amazon

Person 99.3%
Chair 60.4%

Categories

Text analysis

Amazon

34988
100
RODOK-S.VEETA

Google

34.9 88
34.9
88
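
The text-analysis entries are OCR-style strings read off the photograph itself. The Amazon values are the kind of detections Rekognition's text detection returns; a minimal boto3 sketch (the local file name is hypothetical):

import boto3

with open("steinmetz_4.2002.11682.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectText returns LINE and WORD detections with confidence scores;
# the LINE entries are comparable to the Amazon strings listed above.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))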