Human Generated Data

Title

Untitled (woman cutting cake at banquet table)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914-2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13768

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.9
Human 98.9
Person 98.5
Person 98.2
Person 98
Person 97.6
Person 97
Person 92.7
Person 91.9
Plant 83
Sitting 79.1
Person 77
Flower 72.8
Blossom 72.8
Funeral 71.8
Person 64.8
Flower Arrangement 62.8
Helmet 61.7
Clothing 61.7
Apparel 61.7
Flower Bouquet 57.3
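Each Amazon tag above pairs a label with a confidence score, and a cutoff decides which tags are kept. A minimal sketch of that filtering step, using a hypothetical sample shaped like AWS Rekognition's DetectLabels response (the live boto3 call is shown only as a comment, since it needs AWS credentials; the sample values mirror a few of the tags above):

```python
# Sketch: filtering label tags by confidence, assuming a response shaped
# like AWS Rekognition's DetectLabels output. The live call would be
# roughly: boto3.client("rekognition").detect_labels(Image=..., MinConfidence=50)
# Here a small hand-written sample stands in for it.

sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 98.9},
        {"Name": "Plant", "Confidence": 83.0},
        {"Name": "Funeral", "Confidence": 71.8},
        {"Name": "Helmet", "Confidence": 61.7},
        {"Name": "Flower Bouquet", "Confidence": 57.3},
    ]
}

def tags_above(response: dict, threshold: float) -> list[tuple[str, float]]:
    """Return (name, confidence) pairs at or above the threshold,
    sorted from most to least confident."""
    pairs = [(lbl["Name"], lbl["Confidence"]) for lbl in response["Labels"]]
    return sorted(
        (p for p in pairs if p[1] >= threshold),
        key=lambda p: p[1],
        reverse=True,
    )

print(tags_above(sample_response, 70.0))
# → [('Person', 98.9), ('Plant', 83.0), ('Funeral', 71.8)]
```

Raising the threshold trims speculative tags (such as "Funeral" or "Helmet" here) at the cost of recall.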

Clarifai
created on 2023-10-26

people 99.9
group 98.3
furniture 97.7
room 97.2
woman 96.8
wedding 95.8
child 95.4
adult 94.3
indoors 93.7
man 93.4
many 92.9
group together 91.3
family 90.8
boy 87.1
street 85.8
two 85.1
leader 83.8
administration 83.1
art 82.2
several 81.3

Imagga
created on 2022-01-23

room 44
home 27.9
interior 27.4
furniture 26.9
sofa 26.8
couch 24.1
person 22.2
man 20.9
living 19.9
house 19.2
indoors 18.4
people 18.4
sitting 18
adult 16.9
luxury 15.4
modern 15.4
bed 15.3
lifestyle 15.2
table 14.9
pillow 14.8
chair 14.5
women 14.2
bedroom 14
inside 13.8
smiling 13.7
indoor 13.7
male 13.6
relaxation 13.4
child 12.8
two 12.7
teacher 12.5
family 12.4
lamp 12.3
floor 12.1
rest 12
light 12
happy 11.9
attractive 11.9
style 11.9
laptop 11.8
relax 11.8
leisure 11.6
night 11.5
hotel 11.4
together 11.4
couple 11.3
happiness 11
design 10.9
business 10.9
window 10.6
fun 10.5
wood 10
smile 10
pretty 9.8
decor 9.7
decoration 9.6
comfortable 9.5
love 9.5
togetherness 9.4
dark 9.2
groom 9
lady 8.9
romantic 8.9
group 8.9
looking 8.8
sleep 8.7
communication 8.4
fireplace 8.3
color 8.3
studio couch 8.3
cheerful 8.1
television 8
computer 8
architecture 7.8
lounge 7.8
education 7.8
men 7.7
elegant 7.7
lights 7.4
warm 7.3
new 7.3
dress 7.2
romance 7.1
work 7.1
businessman 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

wall 96.9
person 96
indoor 94.3
black and white 80
text 67.6
woman 64.6
white 60.6
clothing 55.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 43-51
Gender Male, 100%
Happy 94.1%
Angry 2%
Fear 1.9%
Surprised 0.7%
Disgusted 0.4%
Calm 0.4%
Confused 0.3%
Sad 0.2%

AWS Rekognition

Age 14-22
Gender Female, 99.5%
Sad 89.9%
Calm 5.6%
Confused 1.1%
Disgusted 1%
Fear 1%
Happy 0.7%
Angry 0.4%
Surprised 0.3%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Sad 63%
Happy 28.6%
Disgusted 1.8%
Angry 1.8%
Calm 1.3%
Fear 1.3%
Confused 1.2%
Surprised 1%

AWS Rekognition

Age 41-49
Gender Male, 80.5%
Fear 98.8%
Calm 0.7%
Sad 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0%
Confused 0%
Happy 0%

AWS Rekognition

Age 25-35
Gender Female, 98.3%
Happy 90.8%
Surprised 7%
Fear 0.8%
Angry 0.4%
Disgusted 0.4%
Sad 0.2%
Confused 0.2%
Calm 0.1%

AWS Rekognition

Age 23-31
Gender Male, 99.7%
Surprised 51.3%
Sad 31.8%
Fear 6.5%
Angry 4.5%
Calm 2.4%
Confused 1.6%
Disgusted 1.5%
Happy 0.4%

AWS Rekognition

Age 49-57
Gender Female, 99.4%
Fear 35.7%
Sad 35%
Surprised 9.6%
Angry 7.9%
Calm 5.3%
Disgusted 2.3%
Happy 2.2%
Confused 2.1%

AWS Rekognition

Age 43-51
Gender Male, 99.6%
Fear 65.4%
Calm 15%
Surprised 7.8%
Sad 3.9%
Angry 3.8%
Disgusted 1.8%
Confused 1.4%
Happy 0.9%
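Each AWS Rekognition face record above carries an age range, a gender call, and an emotion distribution whose confidences sum to roughly 100%. A minimal sketch of how the dominant emotion is read off one such record, assuming a dict shaped like Rekognition's DetectFaces output with `Attributes=["ALL"]` (the sample values mirror the first face above; the live API call is omitted since it needs AWS credentials):

```python
# Sketch: picking the dominant emotion from a face record shaped like
# AWS Rekognition DetectFaces output (Attributes=["ALL"]). The sample
# values mirror the first face analysis above.

face_detail = {
    "AgeRange": {"Low": 43, "High": 51},
    "Gender": {"Value": "Male", "Confidence": 100.0},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 94.1},
        {"Type": "ANGRY", "Confidence": 2.0},
        {"Type": "FEAR", "Confidence": 1.9},
        {"Type": "SURPRISED", "Confidence": 0.7},
        {"Type": "DISGUSTED", "Confidence": 0.4},
        {"Type": "CALM", "Confidence": 0.4},
        {"Type": "CONFUSED", "Confidence": 0.3},
        {"Type": "SAD", "Confidence": 0.2},
    ],
}

def dominant_emotion(face: dict) -> tuple[str, float]:
    """Return the highest-confidence emotion label for one face."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face_detail))  # → ('HAPPY', 94.1)
```

Note that several faces above have no clear winner (e.g. Fear 35.7% vs. Sad 35%), so the top label alone can be misleading without its confidence.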

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
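Unlike Rekognition's percentages, Google Vision reports each face attribute as a likelihood bucket (Very unlikely through Very likely). Mapping those buckets to an ordinal scale makes them comparable, e.g. to count faces whose Joy rating is at least "Possible". A sketch under that assumption, with sample rows mirroring the Joy values in the Google Vision entries above:

```python
# Sketch: comparing Google Vision likelihood buckets by mapping them to
# an ordinal scale. Bucket names follow the listing above; the sample
# faces mirror the spread of Joy values reported there.

LIKELIHOOD = {
    "Very unlikely": 0,
    "Unlikely": 1,
    "Possible": 2,
    "Likely": 3,
    "Very likely": 4,
}

faces = [
    {"Joy": "Very unlikely"},
    {"Joy": "Likely"},
    {"Joy": "Very likely"},
    {"Joy": "Possible"},
    {"Joy": "Unlikely"},
]

def joyful_faces(faces: list[dict], minimum: str = "Possible") -> int:
    """Count faces whose Joy bucket meets or exceeds the minimum bucket."""
    floor = LIKELIHOOD[minimum]
    return sum(1 for f in faces if LIKELIHOOD[f["Joy"]] >= floor)

print(joyful_faces(faces))  # → 3
```

The ordinal mapping is a common convention, not part of the API itself; Vision returns the bucket names and leaves any thresholding to the caller.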

Feature analysis

Amazon

Person
Helmet
Person 98.9%
Person 98.5%
Person 98.2%
Person 98%
Person 97.6%
Person 97%
Person 92.7%
Person 91.9%
Person 77%
Person 64.8%
Helmet 61.7%

Categories

Text analysis

Google

TE
TE