Human Generated Data

Title

Untitled (two photographs: boy leaning on tree; woman cutting cake)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12983

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.7
Human 99.7
Food 99.6
Cake 99.6
Dessert 99.6
Person 98.9
Creme 97.8
Icing 97.8
Cream 97.8
Interior Design 97.4
Indoors 97.4
Person 96.8
Apparel 92.4
Clothing 92.4
Flower 79.5
Blossom 79.5
Plant 79.5
Wedding Cake 75.6
Room 73.6
Gown 70.3
Fashion 70.3
Leisure Activities 66.9
Person 64.8
Robe 63.5
Sweets 61.7
Confectionery 61.7
Wedding 61
Person 57.6
Wedding Gown 56.4
Display 55.2
Monitor 55.2
Electronics 55.2
Screen 55.2
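
The Amazon tags above are label detections with confidence scores. As a rough illustration only, a minimal sketch of how such labels can be produced with AWS Rekognition's DetectLabels API via boto3 follows; the image file name and the MinConfidence threshold are assumptions, not details of the museum's actual pipeline.

import boto3

rekognition = boto3.client("rekognition")

# Read the scanned photograph from disk (file name assumed for illustration)
with open("photograph.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # roughly matches the lowest scores listed above
    )

# Each label carries a name and a confidence score, e.g. "Cake 99.6"
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")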

Clarifai
created on 2019-11-16

people 99.9
furniture 98.3
adult 98.2
woman 97.5
one 96.7
two 95.9
portrait 95.7
room 95.7
man 95.5
sit 95
group 93.8
indoors 91.7
music 91.7
chair 90.6
seat 90
actress 89.3
monochrome 86.8
dress 86.6
wear 85.5
musician 85.1

Imagga
created on 2019-11-16

windowsill 24.5
window 24.5
television 23.4
sill 19.9
black 19.3
person 15.7
structural member 14.9
people 14.5
dark 14.2
blackboard 14
telecommunication system 13.4
fashion 12.8
adult 12.4
man 12.1
support 11.8
old 11.1
portrait 11
style 10.4
light 10
silhouette 9.9
vintage 9.9
room 9.8
lady 9.7
sexy 9.6
passion 9.4
elegance 9.2
office 9.1
dress 9
furniture 9
religion 9
bride 8.6
happiness 8.6
art 8.6
elegant 8.6
male 8.5
clothing 8.1
night 8
body 8
hair 7.9
sitting 7.7
wall 7.7
device 7.7
happy 7.5
one 7.5
business 7.3
looking 7.2
home 7.2
smile 7.1
women 7.1
love 7.1
face 7.1
interior 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99.5
indoor 92
person 89.6
clothing 89.4
wedding dress 83.4
black and white 79.5
woman 76.4
flower 74.9
birthday cake 69.5
bride 67.4
old 62.6
candle 62.4
wedding cake 60.4
blackboard 59
human face 50.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 53.4%
Fear 45%
Calm 45%
Angry 45%
Happy 45%
Confused 45%
Disgusted 45%
Sad 55%
Surprised 45%

AWS Rekognition

Age 32-48
Gender Male, 50.7%
Calm 53.2%
Fear 45.1%
Sad 46%
Happy 45.1%
Disgusted 45.1%
Surprised 45.1%
Confused 45.2%
Angry 45.1%

AWS Rekognition

Age 30-46
Gender Male, 54.6%
Happy 45.1%
Angry 45.1%
Disgusted 45%
Sad 46.3%
Surprised 45%
Calm 53.4%
Fear 45%
Confused 45%

AWS Rekognition

Age 2-8
Gender Female, 54.5%
Angry 45.2%
Calm 48.9%
Confused 45.1%
Disgusted 45.1%
Surprised 45%
Happy 45%
Sad 50.5%
Fear 45.1%

AWS Rekognition

Age 22-34
Gender Male, 53.6%
Fear 45.3%
Angry 45.2%
Sad 53.5%
Happy 45%
Calm 45.9%
Confused 45%
Surprised 45.1%
Disgusted 45%

AWS Rekognition

Age 17-29
Gender Female, 50.4%
Sad 49.7%
Fear 50.2%
Angry 49.5%
Surprised 49.5%
Confused 49.5%
Calm 49.5%
Happy 49.5%
Disgusted 49.5%
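
Each block above reports an estimated age range, a gender guess, and per-emotion confidences for one detected face. A minimal sketch of how such attributes can be retrieved with AWS Rekognition's DetectFaces API via boto3 is shown below; the file name is an assumption, and this is not a description of the museum's own workflow.

import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # assumed local scan of the print
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 22, "High": 34}
    gender = face["Gender"]     # e.g. {"Value": "Female", "Confidence": 53.4}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # one confidence score per emotion type
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")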

Feature analysis

Amazon

Person 99.7%
Wedding Cake 75.6%

Categories

Imagga

interior objects 59.7%
paintings art 35.2%
food drinks 3.8%

Text analysis

Amazon

G1233.5
G1233.5 01R
865151
01R
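
The strings above (edge markings such as "G1233.5" and "01R") are OCR detections. A minimal sketch of how such text can be read out with AWS Rekognition's DetectText API via boto3 follows; the file name is assumed for illustration.

import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE entries group words; WORD entries are the individual tokens
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])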

Google

eG 12 335 R6
eG
12
335
R6