Human Generated Data

Title

Untitled (two young boys and woman sitting under Christmas tree)

Date

1938

People

Artist: Joseph Janney Steinmetz, American (1905–1985)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5309

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Tree 99.9
Plant 99.9
Human 99.7
Person 99.7
Person 99.2
Ornament 96.9
Person 94.9
Christmas Tree 94.2
Clothing 79.6
Apparel 79.6
Outdoors 75.6
Nature 71.9
Furniture 68
Table 68
Drink 64
Beverage 64
Face 60.9
Flower Arrangement 57.5
Flower 57.5
Blossom 57.5
Drawing 57.2
Art 57.2
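
The label/confidence pairs above are typical output from an image-labeling API. As a point of reference, the sketch below shows how tags of this form could be produced with Amazon Rekognition's detect_labels call; the file name and thresholds are illustrative assumptions, not part of the catalog record.

```python
# Sketch: generating "label confidence" tags with Amazon Rekognition.
# Assumes configured AWS credentials; the file name and thresholds are
# placeholders, not taken from the catalog record.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.5309.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # cap the number of returned labels
    MinConfidence=55.0,  # drop low-confidence labels
)

# Print labels in the same "Name Confidence" form used above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```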

Clarifai
created on 2023-10-26

people 99.7
group 98.4
man 96.8
adult 96
child 95.3
illustration 92.2
group together 92.1
woman 91.7
wear 91.4
several 91.1
recreation 90.2
war 88.3
many 87.4
two 87.4
leader 87.1
family 86.8
music 85.2
musician 84.3
three 81.6
boy 81.4

Imagga
created on 2022-01-22

kin 21.1
people 20.6
person 20.4
man 18.8
dress 16.3
portrait 16.2
black 15.6
adult 15.3
male 15
love 15
happiness 13.3
child 13.3
happy 13.2
grunge 12.8
bride 12.5
blackboard 11.7
vintage 10.7
teacher 10.7
face 10.6
retro 10.6
hospital 10.6
human 10.5
couple 10.4
sexy 10.4
women 10.3
wedding 10.1
girls 10
mother 9.8
fashion 9.8
family 9.8
old 9.7
lady 9.7
style 9.6
boy 9.6
hair 9.5
patient 9.3
silhouette 9.1
life 9
room 9
art 8.8
sport 8.8
smiling 8.7
lifestyle 8.7
antique 8.6
married 8.6
cute 8.6
men 8.6
wall 8.5
wife 8.5
design 8.4
groom 8.4
attractive 8.4
aged 8.1
cheerful 8.1
water 8
businessman 7.9
husband 7.8
negative 7.8
newspaper 7.8
sepia 7.8
pretty 7.7
casual 7.6
two 7.6
skin 7.6
head 7.6
power 7.6
fun 7.5
city 7.5
outdoors 7.5
light 7.3
clothing 7.3
dirty 7.2
wet 7.2
interior 7.1
creation 7.1
creative 7.1
parent 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.4
person 90.9
clothing 90.7
outdoor 90.3
black and white 87.7
man 83.4
drawing 81.4
posing 57.7
old 41.3

Color Analysis

Face analysis

AWS Rekognition

Age 18-24
Gender Male, 92.9%
Surprised 65.6%
Happy 25.6%
Fear 3.3%
Angry 1.7%
Calm 1.6%
Sad 1.3%
Disgusted 0.7%
Confused 0.4%

AWS Rekognition

Age 30-40
Gender Female, 60.1%
Happy 97.5%
Surprised 1%
Calm 0.4%
Angry 0.3%
Sad 0.3%
Fear 0.2%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 16-24
Gender Female, 57.6%
Happy 59.9%
Sad 21.1%
Angry 6.1%
Fear 5.4%
Surprised 3%
Calm 2.1%
Disgusted 1.7%
Confused 0.8%
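
The age ranges, gender estimates, and ranked emotion scores in the three blocks above correspond to the face attributes returned by Amazon Rekognition's detect_faces operation. A minimal sketch, assuming a local copy of the image (the file name is a placeholder):

```python
# Sketch: per-face age range, gender, and emotion scores with Amazon
# Rekognition. The file name is a placeholder assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.5309.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; list them highest-confidence first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```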

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
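
The Surprise, Anger, Sorrow, Joy, Headwear, and Blurred ratings in the four blocks above use the likelihood buckets returned by the Google Cloud Vision face-detection API. A minimal sketch, assuming the google-cloud-vision client and a placeholder file name:

```python
# Sketch: face likelihood buckets with the Google Cloud Vision API.
# Assumes the google-cloud-vision package and application credentials;
# the file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.5309.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each likelihood is an enum such as VERY_UNLIKELY, POSSIBLE, VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```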

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

6428
YT33AS
YT33AS 03003930
03003930
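
Strings like these are OCR detections; Amazon Rekognition's detect_text operation reports both whole lines and their constituent words, which is why overlapping strings can appear. A minimal sketch, with the same placeholder file name as above:

```python
# Sketch: extracting text detections with Amazon Rekognition.
# The file name is a placeholder assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.5309.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or one of its constituent WORDs,
# so the same characters can show up more than once.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}%")
```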

Google

6478
6478