Human Generated Data

Title

[Julia, Lyonel, and Laurence(?) Feininger on porch]

Date

1940s-1950s

People

Artist: Unidentified American Framemaker

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1011.73

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Architecture 100
Building 100
Dining Room 100
Dining Table 100
Furniture 100
Indoors 100
Room 100
Table 100
Restaurant 100
Food 99.2
Meal 99.2
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Cafeteria 97.7
Adult 97.5
Male 97.5
Man 97.5
Person 97.5
Cafe 93.1
Face 92.3
Head 92.3
Dish 90.2
Food Court 89.1
Saucer 88
Beverage 83.1
Coffee 83.1
Coffee Cup 83.1
Cup 80.5
Cutlery 79.2
Coffee Cup 75.4
Diner 71.1
Photography 57.7
Spoon 57.5
Glass 57.4
Portrait 57.2
Pottery 57.1
Housing 57.1
Couch 56.6
Tabletop 55.9
Living Room 55.9
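
The Amazon labels above carry percent confidences, and some (Adult, Male, Person) appear once per detected instance. A minimal sketch of thresholding such output, using a few (label, score) pairs copied from the list above — the 90% cutoff is an arbitrary illustration, not part of the service:

```python
# Sketch: keep only high-confidence machine-generated labels.
# The (label, score) pairs are copied from the Amazon list above.
labels = [
    ("Architecture", 100.0),
    ("Food", 99.2),
    ("Coffee Cup", 83.1),
    ("Photography", 57.7),
    ("Living Room", 55.9),
]

def confident(tags, threshold=90.0):
    """Return label names whose confidence meets the threshold (percent)."""
    return [name for name, score in tags if score >= threshold]

print(confident(labels))  # ['Architecture', 'Food']
```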

Clarifai
created on 2023-10-15

people 99.7
monochrome 98.9
man 95.9
adult 95.5
woman 93.9
one 93.9
window 92.8
portrait 92
street 90.7
indoors 90.1
two 89.3
art 87.9
child 83.8
girl 82.8
sit 82.6
room 81.4
sepia 81
light 80
chair 80
vintage 78.9

Imagga
created on 2019-01-31

man 23.6
person 19.1
people 16.7
male 16.4
black 15.6
adult 14.3
device 13.5
dress 12.6
portrait 12.3
love 11.8
world 11.6
human 11.2
passenger 10.8
lifestyle 10.1
face 9.9
fashion 9.8
old 9.8
couple 9.6
hair 9.5
happiness 9.4
light 9.4
groom 9.3
call 9.1
hand 9.1
vintage 9.1
industrial 9.1
body 8.8
bride 8.6
men 8.6
art 8.5
wedding 8.3
alone 8.2
dirty 8.1
wall 8
home 8
interior 8
window 7.9
hands 7.8
model 7.8
attractive 7.7
youth 7.7
house 7.5
dark 7.5
emotion 7.4
life 7.3
travel 7

Google
created on 2019-01-31

Microsoft
created on 2019-01-31

window 91.6
old 41.4
person 41.4
black and white 20.3
monochrome 3.8

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 99.9%
Sad 92.3%
Calm 53.5%
Surprised 6.4%
Fear 6%
Angry 0.6%
Confused 0.6%
Happy 0.5%
Disgusted 0.3%

AWS Rekognition

Age 28-38
Gender Male, 97.5%
Calm 76.3%
Sad 7.2%
Happy 7%
Surprised 6.9%
Fear 6.2%
Confused 3.4%
Disgusted 1.7%
Angry 0.8%
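
The two Rekognition face records above score each emotion independently, so the percentages need not sum to 100 (the first face reports both Sad 92.3% and Calm 53.5%). A minimal sketch of picking the dominant emotion per face, using scores copied from the records above:

```python
# Sketch: dominant emotion per detected face.
# Scores (percent) copied from the two AWS Rekognition records above.
faces = [
    {"Sad": 92.3, "Calm": 53.5, "Surprised": 6.4, "Fear": 6.0},
    {"Calm": 76.3, "Sad": 7.2, "Happy": 7.0, "Surprised": 6.9},
]

def dominant_emotion(scores):
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)

print([dominant_emotion(face) for face in faces])  # ['Sad', 'Calm']
```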

Microsoft Cognitive Services

Age 40
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 98.9%
Male 98.9%
Man 98.9%
Person 98.9%
Coffee Cup 83.1%

Categories

Imagga

interior objects 100%