Human Generated Data

Title

[Tomas Feininger reading]

Date

late 1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.524.32

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 99
Person 99
Furniture 97.3
Table 95.4
Desk 95.4
Sitting 74.7
Electronics 73.6
Display 59.9
Screen 59.9
Monitor 59.9
Bed 59.1
Computer 59
Text 58
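The label-and-confidence pairs above are the kind of output an image-labeling service such as AWS Rekognition's DetectLabels API returns. A minimal sketch in Python with boto3, using a placeholder bucket and file name rather than the museum's actual storage or pipeline:

# Hypothetical sketch: generating label tags with AWS Rekognition (boto3).
# Bucket and object names are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "BRLF.524.32.jpg"}},
    MaxLabels=25,
    MinConfidence=50,
)

# Each label has a name and a confidence score (0-100),
# matching entries like "Human 99" and "Furniture 97.3" above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")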

Clarifai
created on 2019-11-19

people 99.3
adult 98.7
furniture 96.3
room 96.2
man 95
administration 94.6
one 92.7
chair 92.4
leader 91.3
home 90.3
two 88.8
group 85.6
indoors 84.9
sit 84.3
seat 82.7
woman 81.8
group together 81.5
wear 79.6
war 75.8
monochrome 74

Imagga
created on 2019-11-19

office 44.1
computer 40.4
man 39.7
laptop 38.1
person 35.7
people 35.2
desk 34.2
room 33.4
male 33.4
business 31
sitting 31
indoors 30.8
adult 30.2
home 28.7
table 26.9
senior 25.3
working 23.9
businessman 23.9
work 22.9
businesswoman 21.8
smiling 21.7
happy 21.3
lifestyle 19.5
mature 18.6
meeting 17.9
smile 17.8
portrait 16.8
classroom 16.7
corporate 16.3
worker 16.2
businesspeople 16.1
couple 15.7
technology 15.6
men 15.5
job 15.1
indoor 14.6
casual 14.4
elderly 14.4
team 14.3
talking 14.3
group 13.7
retired 13.6
professional 13.6
retirement 13.5
communication 13.4
executive 13.3
together 13.2
keyboard 13.2
looking 12.8
monitor 12.6
education 12.1
manager 12.1
face 12.1
phone 12
attractive 11.9
conference 11.7
workplace 11.4
cheerful 11.4
newspaper 11.4
modern 11.2
paper 11
happiness 11
house 10.9
document 10.8
director 10.7
color 10.6
old 10.5
women 10.3
successful 10.1
alone 10.1
hospital 9.9
notebook 9.8
colleagues 9.7
one 9.7
furniture 9.7
30s 9.6
expression 9.4
screen 9.4
horizontal 9.2
clinic 9
center 8.9
older 8.7
clothing 8.7
jacket 8.7
hand 8.4
teamwork 8.4
coffee 8.3
pensioner 8.3
nurse 8.3
holding 8.3
aged 8.2
cup 8.1
engineer 8
product 8
interior 8
debate 7.9
boardroom 7.9
associates 7.9
60s 7.8
businessperson 7.8
discussion 7.8
mid adult 7.7
corporation 7.7
studying 7.7
husband 7.6
reading 7.6
finance 7.6
wife 7.6
career 7.6
pen 7.6
contemporary 7.5
camera 7.4
suit 7.4
occupation 7.3
20s 7.3
success 7.2
blond 7.2
bright 7.2
to 7.1
patient 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 95.2
person 86.4
man 76.5
human face 76.1
clothing 75.9
computer 67.9
table 57.2
laptop 53
desk 12.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 19-31
Gender Female, 88.2%
Happy 4.4%
Confused 1.6%
Sad 2.8%
Calm 78.7%
Angry 8.6%
Fear 0.9%
Surprised 2.2%
Disgusted 0.7%
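The age range, gender, and emotion percentages above resemble the face attributes returned by AWS Rekognition's DetectFaces call when all attributes are requested. A minimal, hypothetical sketch (placeholder file path, not the museum's actual pipeline):

# Hypothetical sketch: face attribute estimates with AWS Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 19, "High": 31}
    gender = face["Gender"]     # e.g. {"Value": "Female", "Confidence": 88.2}
    print(f"Age {age['Low']}-{age['High']}, Gender {gender['Value']} ({gender['Confidence']:.1f}%)")
    for emotion in face["Emotions"]:    # CALM, HAPPY, SAD, ANGRY, ...
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")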

Feature analysis

Amazon

Person 99%
Bed 59.1%

Categories

Imagga

interior objects 97.8%

Captions

Text analysis

Amazon

BOSTON

Google

BOSTON
BOSTON
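A detected string such as "BOSTON" is the kind of result returned by an in-image text (OCR) call like AWS Rekognition's DetectText; the Google entries above would come from a separate Vision API request. A minimal, hypothetical sketch for the Rekognition side, with a placeholder image source:

# Hypothetical sketch: detecting text in an image with AWS Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# Rekognition reports both LINE and WORD detections; printing the lines
# reproduces entries like "BOSTON".
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")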