Human Generated Data

Title

Untitled (group of adults painting on easels near a lake)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10648

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.7
Person 99.7
Person 96.9
Person 96.1
Person 94.1
Person 92.6
Person 92
Person 91.2
Person 89.2
Person 87.8
Person 81.8
Building 80.2
Nature 77.5
Urban 77.2
Meal 76.3
Food 76.3
Person 76.2
Outdoors 75.4
Person 75.1
People 71.8
Workshop 69.3
Person 59
Countryside 56.9
Furniture 56.7
Indoors 55.9
Room 55.9
Crowd 55
Person 42
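
The Amazon tags above are confidence-scored labels of the kind returned by AWS Rekognition's label detection. A minimal sketch of such a call with boto3 follows; the S3 bucket and object names are hypothetical placeholders, not part of this record.

# Sketch: confidence-scored image labels via AWS Rekognition (boto3).
# The S3 bucket and key below are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-10648.jpg"}},
    MinConfidence=40,
)

for label in response["Labels"]:
    # Prints lines such as "Person 99.7"
    print(f"{label['Name']} {label['Confidence']:.1f}")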

Clarifai
created on 2023-10-26

people 99.6
group 98.2
many 98.1
war 97.8
military 97.3
soldier 95.4
adult 95.3
man 93.7
vehicle 92.5
group together 91.8
crowd 91.2
wear 91
skirmish 90.8
uniform 89.2
administration 88.1
monochrome 83.4
woman 83.3
cavalry 83.2
army 82.9
force 82.3

Imagga
created on 2022-01-15

television 23.1
black 18.6
old 18.1
sky 16.6
vintage 15.7
landscape 15.6
industrial 15.4
telecommunication system 14.5
travel 13.4
grunge 12.8
building 12.5
architecture 12.5
structure 12.5
wheeled vehicle 11.7
dirty 11.7
history 11.6
drawing 11.1
snow 11.1
winter 10.2
silhouette 9.9
scenery 9.9
art 9.9
cold 9.5
construction 9.4
historical 9.4
man 9.4
industry 9.4
billboard 9.3
male 9.2
danger 9.1
factory 9
destruction 8.8
symbol 8.8
antique 8.7
grungy 8.5
monitor 8.5
park 8.5
power 8.4
smoke 8.4
house 8.4
texture 8.3
frame 8.3
city 8.3
tourism 8.2
pattern 8.2
protection 8.2
paint 8.1
wagon 8.1
transportation 8.1
vehicle 8
equipment 8
person 7.9
scenic 7.9
people 7.8
scene 7.8
ancient 7.8
chemical 7.7
gas 7.7
signboard 7.5
sign 7.5
track 7.5
environment 7.4
grain 7.4
retro 7.4
vacation 7.4
light 7.4
water 7.3
river 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.8
grave 92.5
cemetery 91.9
old 81.4
black 67.8
several 12.8

Color Analysis

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 64.1%
Calm 85.6%
Sad 9.3%
Happy 3.5%
Disgusted 1.1%
Fear 0.2%
Angry 0.1%
Surprised 0.1%
Confused 0.1%

AWS Rekognition

Age 36-44
Gender Male, 99.8%
Calm 99.4%
Happy 0.5%
Surprised 0.1%
Disgusted 0%
Angry 0%
Confused 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Female, 98.9%
Calm 75.6%
Happy 18.6%
Fear 1.6%
Surprised 1.3%
Sad 1.3%
Angry 0.9%
Disgusted 0.4%
Confused 0.3%

AWS Rekognition

Age 33-41
Gender Male, 99.5%
Sad 54.4%
Happy 24.2%
Calm 10.3%
Fear 5.3%
Angry 2.7%
Confused 1.1%
Disgusted 1.1%
Surprised 0.9%

AWS Rekognition

Age 25-35
Gender Male, 94.4%
Calm 44.6%
Happy 32.7%
Confused 10.9%
Sad 5%
Fear 3.9%
Surprised 1.2%
Disgusted 1.1%
Angry 0.5%

AWS Rekognition

Age 26-36
Gender Male, 97.9%
Calm 87.1%
Sad 9.2%
Confused 2.1%
Happy 0.5%
Angry 0.3%
Fear 0.3%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 16-24
Gender Female, 99.2%
Calm 74.6%
Sad 10.5%
Fear 6.4%
Surprised 3.9%
Confused 2.1%
Happy 1%
Angry 0.8%
Disgusted 0.7%

AWS Rekognition

Age 23-33
Gender Male, 77.7%
Calm 96.3%
Happy 1.2%
Sad 0.9%
Surprised 0.5%
Fear 0.5%
Disgusted 0.2%
Angry 0.2%
Confused 0.2%

AWS Rekognition

Age 19-27
Gender Female, 65%
Calm 83.4%
Happy 6.7%
Sad 3.8%
Fear 2.3%
Confused 1.9%
Surprised 1%
Disgusted 0.6%
Angry 0.4%

AWS Rekognition

Age 26-36
Gender Female, 86%
Calm 97.5%
Sad 1.2%
Happy 0.9%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Male, 93.4%
Sad 54.9%
Calm 38.8%
Happy 1.6%
Angry 1.1%
Disgusted 1%
Surprised 0.9%
Fear 0.9%
Confused 0.9%
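
The age ranges, gender estimates, and emotion scores above follow the shape of AWS Rekognition's face-detection output. A minimal sketch, assuming a local copy of the image under a hypothetical file name:

# Sketch: face attributes (age range, gender, emotions) via Rekognition detect_faces.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz-10648.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")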

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
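
The Google Vision entries above report per-face likelihoods (surprise, anger, sorrow, joy, headwear, blurred) rather than numeric scores. A minimal sketch with the google-cloud-vision client, again using a placeholder file name:

# Sketch: per-face likelihood fields via the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz-10648.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood values are enum members such as VERY_UNLIKELY, POSSIBLE, LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)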

Feature analysis

Amazon

Person 99.7%

Text analysis

Amazon

34991
100
VI77A92
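
The strings above come from an OCR-style pass over the photograph. A minimal sketch of text detection with AWS Rekognition, using the same hypothetical file name as the sketches above:

# Sketch: detected text lines via Rekognition detect_text.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz-10648.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])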

Google

34991
34991