Human Generated Data

Title

[Young man sitting on wood step]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.507.31

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 98
Person 98
Vehicle 78.8
Aircraft 78.8
Airplane 78.8
Transportation 78.8
Building 76.7
Apparel 76.6
Clothing 76.6
Face 67.4
Factory 56.4
Photo 55.1
Photography 55.1
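The number after each tag is the service's confidence score on a 0–100 scale. A minimal sketch of filtering such labels by confidence, using the Amazon Rekognition values listed above (the 90-point threshold is an arbitrary illustrative choice, not part of the record):

```python
# Labels and scores copied from the Amazon Rekognition list above.
labels = {
    "Human": 98.0,
    "Person": 98.0,
    "Vehicle": 78.8,
    "Aircraft": 78.8,
    "Airplane": 78.8,
    "Transportation": 78.8,
    "Building": 76.7,
    "Apparel": 76.6,
    "Clothing": 76.6,
    "Face": 67.4,
    "Factory": 56.4,
    "Photo": 55.1,
    "Photography": 55.1,
}

# Keep only tags the model is at least 90% confident about.
high_confidence = sorted(tag for tag, score in labels.items() if score >= 90.0)
print(high_confidence)  # → ['Human', 'Person']
```

Lower thresholds admit the aviation-related tags (Aircraft, Airplane, Vehicle) that the model assigned with roughly 79% confidence.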

Clarifai
created on 2019-11-19

people 99.7
vehicle 99.1
one 98.9
adult 98.5
aircraft 96.2
transportation system 96.1
man 95.9
watercraft 95.7
two 93.3
wear 91.8
war 89.9
woman 89.5
music 86.9
retro 86.1
group 83.4
monochrome 81
military 80.9
airplane 79.6
leader 78.7
administration 77.9

Imagga
created on 2019-11-19

grand piano 100
piano 100
keyboard instrument 92
stringed instrument 90.5
percussion instrument 87.9
musical instrument 62.1
car 45.1
vehicle 30.1
adult 26.5
person 25.2
automobile 24.9
driver 24.3
transportation 23.3
sitting 23.2
happy 23.2
smile 21.4
laptop 20
people 19.5
auto 18.2
driving 17.4
smiling 17.4
computer 16.8
man 16.8
business 15.2
drive 15.1
work 14.9
portrait 14.9
transport 14.6
male 14.2
working 14.1
pretty 14
attractive 14
office 12.9
face 12.1
communication 11.8
happiness 11.8
interior 11.5
indoors 11.4
travel 11.3
keyboard 11.3
professional 11
cute 10.8
technology 10.4
wheel 10.4
home 10.4
20s 10.1
student 10
road 9.9
hand 9.9
cheerful 9.8
outdoors 9.7
brunette 9.6
women 9.5
inside 9.2
equipment 8.9
hair 8.7
seat 8.5
holding 8.3
worker 8
businessman 7.9
boy 7.8
black 7.8
corporate 7.7
casual 7.6
desk 7.6
human 7.5
notebook 7.5
leisure 7.5
executive 7.4
safety 7.4
occupation 7.3
motor vehicle 7.2
lifestyle 7.2
looking 7.2
job 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

black and white 91.9
text 90.3
person 89.6
clothing 85.8
man 57.5
open 48.3
old 42.4

Face analysis

Amazon

AWS Rekognition

Age 38-56
Gender Male, 97.8%
Angry 7.2%
Fear 6.7%
Disgusted 40.3%
Surprised 15.2%
Happy 2.4%
Sad 1.5%
Confused 1.9%
Calm 24.8%

Feature analysis

Amazon

Person 98%
Airplane 78.8%

Captions

Microsoft

a person sitting in a box 43.7%
a person sitting in front of a box 37.9%
a person sitting on a box 34%

Text analysis

Amazon

SAN
SAN RAN
RAN
CROWWVE

Google

CROWZE SAN RANE పర
పర
SAN
RANE
CROWZE