Human Generated Data

Title

Untitled (painting class on beach, Sarasota, Florida)

Date

1952, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.276

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-05-27

Person 99.6
Human 99.6
Person 99.5
Person 99.5
Person 99.3
Person 99.1
Person 99.1
Person 98.8
Person 98.8
Person 98.8
Person 98.3
Person 98.1
Person 97.6
Chair 94.6
Furniture 94.6
Meal 94.3
Food 94.3
Person 86.5
Outdoors 80.4
People 79.7
Clothing 76.7
Apparel 76.7
Person 75.6
Crowd 73
Military 72.3
Leisure Activities 71
Sitting 69.8
Person 68.1
Picnic 66.9
Vacation 66.9
Tree 65.5
Plant 65.5
Military Uniform 64.8
Female 62.1
Army 61.8
Armored 61.8
Water 59
Troop 57.9
Musician 57.2
Musical Instrument 57.2
Soldier 55.4
Person 43.6
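
The Amazon tags above, one label name per line with a confidence score, are the kind of output returned by Amazon Rekognition's DetectLabels API. A minimal sketch of such a call in Python, assuming boto3 is installed, AWS credentials are configured, and a local copy of the image is available (the filename image.jpg and the confidence floor of 40 are hypothetical):

import boto3

# Region and credentials come from the standard AWS configuration.
client = boto3.client("rekognition")

# Send the image bytes and request labels above a confidence floor.
with open("image.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=40)

# Print each label to one decimal place, matching the list above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))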

Clarifai
created on 2023-10-30

people 100
group together 99.6
group 99.6
adult 99
many 98.8
child 97.8
war 97.8
military 97.4
man 97.3
campsite 96.9
soldier 96.4
woman 95.7
wear 95.4
administration 94.6
several 93.8
skirmish 92.7
tent 92.7
home 91.7
boy 90.8
recreation 89.8
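
The Clarifai concepts above look like the output of Clarifai's general image-recognition model. A minimal sketch against the v2 REST API, assuming the public general model path, a personal access token, and a hosted copy of the image (the URL and token are hypothetical placeholders); the API reports values between 0 and 1, so they are scaled by 100 to match the list above:

import requests

IMAGE_URL = "https://example.com/image.jpg"  # hypothetical image location

response = requests.post(
    "https://api.clarifai.com/v2/users/clarifai/apps/main/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT"},  # hypothetical access token
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Each concept pairs a name with a 0-1 confidence value.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))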

Imagga
created on 2022-05-27

barrow 25.4
man 23.5
handcart 19.5
vehicle 18.9
shovel 18.1
male 17.8
seller 17.4
old 17.4
wheeled vehicle 16.8
person 15.3
sky 15.3
people 15.1
outdoor 13
outdoors 12.8
military 12.5
architecture 11.7
stone 11.5
soldier 10.7
uniform 10.6
travel 10.6
building 10.5
men 10.3
tool 10.3
two 10.2
protection 10
danger 10
history 9.8
hand tool 9.8
war 9.6
tree 9.6
weapon 9.5
clothing 9.2
vintage 9.1
adult 9.1
transportation 9
working 8.8
ancient 8.6
day 8.6
statue 8.4
summer 8.4
city 8.3
sport 8.2
vacation 8.2
dirty 8.1
activity 8.1
country 7.9
sand 7.9
rock 7.8
destruction 7.8
play 7.8
mask 7.7
equipment 7.6
fun 7.5
street 7.4
historic 7.3
gun 7.2
recreation 7.2
game 7.1
trees 7.1
conveyance 7.1
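
The Imagga tags above have the shape of Imagga's /v2/tags endpoint, which pairs a confidence score with a localized tag name. A minimal sketch, assuming an API key and secret for HTTP basic auth and a hosted copy of the image (all placeholders hypothetical):

import requests

IMAGE_URL = "https://example.com/image.jpg"  # hypothetical image location

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # hypothetical credentials
    timeout=30,
)
response.raise_for_status()

# Each entry carries a confidence and a tag name keyed by language.
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))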

Microsoft
created on 2022-05-27

outdoor 99.9
sky 99.8
person 98.5
tree 98.5
clothing 93
group 83.8
man 82.7
old 71.1
people 67.3
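
The Microsoft tags above match the output of the Azure Computer Vision tagging endpoint, which returns confidences between 0 and 1 (scaled by 100 here). A minimal sketch against the v3.2 REST API, assuming a resource endpoint, a subscription key, and a local image (all placeholders hypothetical):

import requests

with open("image.jpg", "rb") as f:  # hypothetical local image
    response = requests.post(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": "YOUR_KEY",  # hypothetical key
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
        timeout=30,
    )
response.raise_for_status()

for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))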

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 99.7%
Happy 95.1%
Surprised 6.4%
Fear 5.9%
Sad 2.3%
Calm 2.2%
Disgusted 1.1%
Confused 0.4%
Angry 0.3%

AWS Rekognition

Age 63-73
Gender Male, 99.9%
Calm 72.6%
Angry 13.5%
Surprised 7.2%
Fear 6.1%
Sad 4.5%
Disgusted 3.5%
Happy 1.5%
Confused 1.1%

AWS Rekognition

Age 50-58
Gender Male, 50.8%
Happy 88.6%
Calm 9.8%
Surprised 6.4%
Fear 5.9%
Sad 2.3%
Angry 0.3%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 34-42
Gender Female, 97.7%
Calm 65.3%
Fear 15.4%
Surprised 10.4%
Angry 3.6%
Happy 3%
Sad 3%
Disgusted 2%
Confused 1.4%

AWS Rekognition

Age 48-54
Gender Male, 96.2%
Calm 90.1%
Surprised 6.6%
Fear 6.1%
Sad 4.8%
Angry 1.2%
Confused 0.6%
Disgusted 0.5%
Happy 0.4%

AWS Rekognition

Age 54-64
Gender Male, 99.9%
Calm 69.3%
Confused 12.5%
Surprised 7.8%
Fear 6.8%
Happy 4.9%
Sad 4.1%
Angry 2%
Disgusted 1.5%

AWS Rekognition

Age 36-44
Gender Female, 100%
Happy 31.8%
Calm 28.1%
Angry 18.2%
Disgusted 11.5%
Surprised 9.9%
Fear 6.3%
Sad 2.5%
Confused 2.3%

AWS Rekognition

Age 54-64
Gender Male, 92%
Sad 98.1%
Calm 13.5%
Angry 9.4%
Fear 7.9%
Confused 7.2%
Surprised 6.9%
Happy 5.4%
Disgusted 4.1%

AWS Rekognition

Age 36-44
Gender Female, 100%
Calm 82.8%
Surprised 6.9%
Fear 6.4%
Angry 4.7%
Sad 4.4%
Happy 2.1%
Disgusted 1.5%
Confused 1.1%

AWS Rekognition

Age 34-42
Gender Male, 98.3%
Disgusted 74.8%
Sad 12.2%
Surprised 6.5%
Fear 6.2%
Angry 5.8%
Calm 3.1%
Happy 1.4%
Confused 0.6%

AWS Rekognition

Age 41-49
Gender Female, 99.9%
Disgusted 43.5%
Calm 24%
Confused 17.3%
Surprised 8.9%
Fear 6.5%
Sad 4%
Happy 3.2%
Angry 1.5%

AWS Rekognition

Age 50-58
Gender Female, 99.9%
Sad 60.5%
Happy 33.1%
Calm 31.3%
Surprised 7.1%
Fear 6.6%
Angry 0.9%
Confused 0.8%
Disgusted 0.7%

AWS Rekognition

Age 45-51
Gender Female, 100%
Happy 99.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0%
Confused 0%
Calm 0%
Angry 0%

AWS Rekognition

Age 37-45
Gender Female, 99.3%
Sad 51.8%
Calm 44.1%
Confused 14.4%
Surprised 8.1%
Fear 6.9%
Happy 4.2%
Angry 2.2%
Disgusted 2.1%

AWS Rekognition

Age 37-45
Gender Male, 99.9%
Surprised 73.7%
Happy 18.5%
Sad 11.6%
Calm 10.5%
Fear 7.2%
Disgusted 4.4%
Angry 4%
Confused 3.4%

AWS Rekognition

Age 22-30
Gender Male, 98.8%
Calm 88.8%
Happy 6.9%
Surprised 6.5%
Fear 5.9%
Sad 2.6%
Confused 1.9%
Disgusted 0.3%
Angry 0.2%
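
Each AWS Rekognition face record above (an age range, a gender guess, and eight emotion scores) has the structure returned by the DetectFaces API when all attributes are requested. A minimal sketch, again assuming boto3 and a hypothetical local image file:

import boto3

client = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetail per detected face: age range, gender, emotion scores.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

Note that the emotion confidences are scored independently, which appears to be why the eight values in some of the records above sum to more than 100%.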

Microsoft Cognitive Services

Age 49
Gender Female

Microsoft Cognitive Services

Age 37
Gender Male
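
The two Microsoft Cognitive Services records above (a point estimate of age plus a gender) match what the Azure Face API's detect endpoint returned when the age and gender attributes were requested; Microsoft has since restricted access to these attributes. A minimal sketch of such a call, assuming a v1.0 endpoint, a key, and a local image (all placeholders hypothetical):

import requests

with open("image.jpg", "rb") as f:  # hypothetical local image
    response = requests.post(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={
            "Ocp-Apim-Subscription-Key": "YOUR_KEY",  # hypothetical key
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
        timeout=30,
    )
response.raise_for_status()

for face in response.json():
    attributes = face["faceAttributes"]
    print(f"Age {attributes['age']:.0f}")
    print(f"Gender {attributes['gender'].capitalize()}")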

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
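
The Google Vision face records above report bucketed likelihoods (Very unlikely through Very likely) rather than numeric scores. A minimal sketch using the google-cloud-vision 2.x client, assuming application default credentials are configured and the image is local (filename hypothetical):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("image.jpg", "rb") as f:  # hypothetical local image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood fields are enums such as VERY_UNLIKELY, POSSIBLE, LIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)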

Feature analysis

Amazon

Person 99.6%
Chair 94.6%
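
The Feature analysis entries above correspond to Rekognition labels that also carry per-instance bounding boxes (Person and Chair are among the labels for which DetectLabels localizes individual instances). A minimal sketch that repeats the detect_labels call from the first sketch and prints each located instance:

import boto3

client = boto3.client("rekognition")

with open("image.jpg", "rb") as f:  # hypothetical local image
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=40)

# Some labels include Instances, each with its own box and confidence.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        print(label["Name"], round(instance["Confidence"], 1), instance["BoundingBox"])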

Text analysis

Google

NETRON SERIBA
NETRON
SERIBA
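
The text results above have the shape of Google Vision's text detection output, in which the first annotation is the full detected string and the following annotations are the individual words (here, NETRON SERIBA followed by NETRON and SERIBA). A minimal sketch, again assuming the google-cloud-vision client and a hypothetical local image:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("image.jpg", "rb") as f:  # hypothetical local image
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# First entry: the full text block; later entries: individual words.
for annotation in response.text_annotations:
    print(annotation.description)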