Human Generated Data

Title

Untitled (two portraits of mother and child)

Date

c. 1930

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1944

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 95.1
Person 93.9
Person 93.8
Person 90.8
Art 90.6
Person 90.3
Drawing 83.5
Painting 81.8
Outdoors 76.9
Face 74.5
Nature 73.7
Sketch 72.1
Portrait 63.4
Photography 63.4
Photo 63.4
People 55.8
Person 43.5

Clarifai
created on 2023-10-25

people 99.3
child 98.4
art 97.1
portrait 95.9
baby 95.4
sepia 93.5
girl 93.1
man 93.0
monochrome 92.7
son 90.7
wear 90.4
two 87.5
painting 86.3
group 85.9
paper 85.7
adult 85.2
collage 83.1
wedding 83.0
woman 82.4
interaction 81.8

Imagga
created on 2021-12-14

negative 100
film 86.2
photographic paper 66.6
photographic equipment 44.4
snow 33.6
ice 33.3
cold 31.8
winter 27.2
water 23.3
cool 19.5
frost 19.2
frozen 18.1
crystal 16.2
holiday 15
freeze 13.4
season 13.2
clear 13.1
weather 12.5
sky 12.1
splash 11.3
celebration 11.2
man 10.8
transparent 10.7
solid 10.6
bright 10
drop 10
religion 9.9
backgrounds 9.7
liquid 9.6
aqua 9.5
color 9.5
glass 9.4
groom 9.4
motion 9.4
close 9.1
year 9.1
windshield 9.1
environment 9
texture 9
people 8.9
screen 8.9
new 8.9
splashing 8.7
shiny 8.7
decoration 8.7
falling 8.7
snowflake 8.7
person 8.6
fresh 8.5
rain 8.5
bubble 8.5
clouds 8.4
wallpaper 8.4
outdoor 8.4
festive 8.3
ocean 8.3
outdoors 8.2
light 8
seasonal 7.9
ball 7.9
day 7.8
travel 7.7
stream 7.6
happy 7.5
drink 7.5
flowing 7.5
shape 7.5
purity 7.4
air 7.4
shine 7.4
smooth 7.3
snowman 7.2
star 7.2
wet 7.2
love 7.1
male 7.1
surface 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 89.8
snow 87.9
sketch 87.1
drawing 84.3
black and white 71.6
white 67.9
posing 43.1

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 31-47
Gender Female, 82.5%
Calm 74.5%
Sad 14.5%
Fear 4.1%
Surprised 3.5%
Happy 1.3%
Confused 1.2%
Angry 0.5%
Disgusted 0.3%

AWS Rekognition

Age 37-55
Gender Female, 91.9%
Calm 78.9%
Happy 15.3%
Sad 3.5%
Confused 0.7%
Angry 0.5%
Fear 0.5%
Surprised 0.4%
Disgusted 0.2%

AWS Rekognition

Age 28-44
Gender Female, 97.6%
Calm 56.5%
Happy 35.7%
Sad 2.3%
Confused 2.0%
Surprised 1.6%
Angry 0.7%
Fear 0.6%
Disgusted 0.5%

AWS Rekognition

Age 21-33
Gender Female, 90.4%
Happy 99.6%
Calm 0.3%
Sad 0%
Angry 0%
Surprised 0%
Confused 0%
Fear 0%
Disgusted 0%

Feature Analysis

Amazon

Person 93.9%
Painting 81.8%

Categories

Imagga

paintings art 99.8%