Human Generated Data

Title

[Car and driver reflected in shop window, with mannequins]

Date

Unknown

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.332.18

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn
Machine Generated Data

Tags

Amazon
created on 2019-05-29

Person 97.6
Human 97.6
Automobile 97
Vehicle 97
Car 97
Transportation 97
Car 95.5
Person 94.2
Interior Design 94.1
Indoors 94.1
Furniture 80.5
Car 75.6
Room 67
Car 65.9
Electronics 63.2
Screen 63.2
Person 62.4
Display 61.8
Monitor 61.8
Table 60.5
Chair 59.9
Car 50.8

Clarifai
created on 2019-05-29

people 99.5
adult 94.5
furniture 94
vehicle 93.6
group 92.4
indoors 92.1
room 90.6
man 88.2
group together 86.6
war 86.1
transportation system 86
military 85.7
administration 83.9
industry 83.6
aircraft 82.8
no person 80.8
one 80.1
chair 79.8
grinder 77.5
leader 74.7

Imagga
created on 2019-05-29

hall 23.2
negative 22.8
interior 21.2
architecture 21.1
room 20.9
city 19.9
building 19.2
film 19.1
chair 18.8
table 18.4
house 16.7
center 15.3
hospital 14.2
blackboard 13.6
photographic paper 13.3
modern 13.3
old 12.5
urban 12.2
scene 12.1
glass 12.1
counter 11.8
light 11.3
furniture 11.3
empty 11.3
design 11.2
water 10.7
decor 10.6
travel 10.6
metal 10.4
floor 10.2
sky 10.2
wall 9.7
classroom 9.2
sink 9
restaurant 8.9
home 8.9
photographic equipment 8.9
construction 8.5
industry 8.5
town 8.3
inside 8.3
vintage 8.1
steel 7.9
equipment 7.8
factory 7.8
dining 7.6
cityscape 7.6
bridge 7.6
kitchen 7.4
street 7.4
decoration 7.3
indoor 7.3
business 7.3
industrial 7.3
river 7.1

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

old 78
white 68.8

Face analysis

AWS Rekognition

Age 14-23
Gender Female, 55%
Disgusted 45.1%
Angry 45.3%
Surprised 45.2%
Happy 45.1%
Calm 52.4%
Confused 45.4%
Sad 46.6%

AWS Rekognition

Age 23-38
Gender Male, 52.1%
Happy 48.6%
Confused 45.3%
Disgusted 46.2%
Calm 47.4%
Sad 46.4%
Angry 45.7%
Surprised 45.4%

Microsoft Cognitive Services

Age 6
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.6%
Car 97%

Categories

Imagga

interior objects 98.8%