Human Generated Data

Title

[Andreas Feininger reading]

Date

1940s

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.679.57

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.3
Human 99.3
Person 99.2
Home Decor 96.7
Furniture 94.7
Text 73.2
Couch 72.9
Room 72.9
Indoors 72.9
Table 69.2
Bed 67.4
Cushion 63.8
Bedroom 61.2
Advertisement 59
Collage 59
Screen 58.1
Electronics 58.1
Poster 57.1
Dining Table 56.6

Imagga
created on 2022-02-25

hole 36
negative 32.9
film 29.9
laundry 20
old 17.4
photographic paper 15.7
architecture 13.6
iron 13.4
travel 13.4
city 12.5
black 12
design 11.8
tourism 11.5
retro 11.5
holiday 10.7
photographic equipment 10.5
appliance 10.4
building 10.4
home appliance 10.2
sky 10.2
light 10
house 9.2
vintage 9.1
texture 9
device 9
technology 8.9
urban 8.7
construction 8.5
stone 8.5
industry 8.5
grunge 8.5
modern 8.4
vacation 8.2
room 8.1
water 8
home 8
art 7.9
movie 7.7
dark 7.5
landscape 7.4
table 7.3
business 7.3
industrial 7.3
border 7.2
decoration 7.2
scenery 7.2

Google
created on 2022-02-25

Photograph 94.3
White 92.2
Organ 90.8
Product 90.7
Black 89.5
Style 83.8
Line 82.4
Black-and-white 82
Font 80.3
Photographic film 80
Adaptation 79.2
Tints and shades 77.2
Snapshot 74.3
Rectangle 72.9
Monochrome photography 69.7
Room 68.3
Monochrome 68.1
Pattern 67.6
Negative 65.5
Art 65.3

Microsoft
created on 2022-02-25

person 94.4
indoor 94.2
text 68.9

Face analysis

AWS Rekognition (Face 1)

Age 53-61
Gender Male, 99.1%
Calm 53%
Happy 33%
Sad 11.1%
Confused 1.1%
Disgusted 0.6%
Fear 0.5%
Angry 0.4%
Surprised 0.2%

AWS Rekognition (Face 2)

Age 42-50
Gender Male, 68.9%
Calm 92.8%
Sad 3.8%
Happy 1.9%
Confused 0.8%
Surprised 0.3%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

Google Vision (Face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (Face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Poster 57.1%

Captions

Microsoft

a man sitting at a train station 42.5%
a group of people sitting at a train station 37.6%
a man sitting on a train 30.1%

Text analysis

Amazon

MIC
PANATOMIC

Google

MIC PANATOMIC
MIC
PANATOMIC