Human Generated Data

Title

[Lyonel Feininger with Galka Scheyer at her house in Hollywood, California]

Date

1936

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.142.2

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Person 99
Human 99
Chair 92.1
Furniture 92.1
Person 90.7
Sitting 77
Clothing 76.8
Apparel 76.8
Plant 70.3
Overcoat 70
Suit 70
Coat 70
Electronics 67.7
Screen 67.7
Monitor 67.7
Display 67.7
LCD Screen 67.7
Meal 66.8
Food 66.8
Chair 65.6
Face 59.9
Restaurant 59.4
Cafeteria 59.4
Table 58.3
Shelf 56.9
Text 56
Finger 55.8

Clarifai
created on 2021-04-04

people 99.9
monochrome 98.6
furniture 96.9
adult 96.8
one 95.6
man 95
room 94.8
group 91.4
two 89.5
chair 89.5
art 88.3
street 87.9
music 87.2
woman 86.8
wear 85.3
group together 84.6
seat 83.3
war 82.9
piano 82
portrait 80.6

Imagga
created on 2021-04-04

musical instrument 32.6
percussion instrument 27.4
piano 25.5
grand piano 24.2
person 24
chair 23.7
people 21.8
stringed instrument 21.6
man 21.5
keyboard instrument 21.4
laptop 18.8
business 16.4
office 15.7
adult 15
sitting 14.6
computer 14.5
dark 14.2
television 14
silhouette 13.2
male 12.8
room 12.8
lifestyle 12.3
work 11.8
interior 11.5
device 11
attractive 10.5
group 10.5
sexy 10.4
table 10.4
black 10.2
happiness 10.2
teacher 10
suit 10
electronic instrument 9.9
fashion 9.8
working 9.7
technology 9.6
model 9.3
music 9.2
couple 8.7
glass 8.6
sensual 8.2
lady 8.1
light 8
water 8
night 8
worker 8
smiling 8
businessman 7.9
seat 7.9
men 7.7
modern 7.7
sky 7.7
communication 7.6
window 7.5
passion 7.5
portable computer 7.1
hair 7.1
love 7.1

Google
created on 2021-04-04

Table 94.1
Black 89.6
Chair 88.5
Black-and-white 85.1
Plant 79.6
Monochrome photography 76.4
Monochrome 74.2
Desk 70.6
Room 69.8
Sitting 69.1
Writing desk 67.1
Font 66.6
Event 66.1
Rectangle 64.8
Vintage clothing 64.8
Stock photography 64
Art 60.3
Houseplant 57.7
History 57.3
Photo caption 52.1

Microsoft
created on 2021-04-04

indoor 95.4
black and white 94.8
text 94.7
monochrome 77.3
clothing 68.5
person 67

Face analysis

Amazon

Google

AWS Rekognition

Age 43-61
Gender Female, 90%
Sad 96%
Fear 2.2%
Calm 1.1%
Confused 0.3%
Happy 0.3%
Surprised 0.1%
Angry 0.1%
Disgusted 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Chair 92.1%