Human Generated Data

Title

[New York: Julia Feininger]

Date

1940s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.679.24

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2022-02-19

Person 99.3
Human 99.3
Person 99.3
Head 85
Indoors 83.9
Room 81.3
Text 76.4
Interior Design 74.8
Court 65.8
Portrait 64.1
Photography 64.1
Face 64.1
Photo 64.1
Judge 63.4
Monitor 58.3
Electronics 58.3
Screen 58.3
Display 58.3
Crowd 56

Clarifai
created on 2023-10-28

movie 98.8
people 98.1
negative 98
adult 96.5
man 93.2
one 93.1
portrait 93
business 89.7
technology 87.5
actor 86.4
no person 86.1
connection 84.7
desktop 83.2
room 81.9
filmstrip 81.6
wear 81.4
picture frame 81
screen 80.8
cinematography 79
indoors 77.5

Imagga
created on 2022-02-19

washer 87
white goods 69.9
home appliance 53.1
appliance 42
cassette tape 36.5
magnetic tape 29.2
device 26.1
equipment 23.6
hole 23.1
memory device 22
old 18.8
durables 17.8
retro 16.4
black 16.2
cassette 15.7
technology 14.8
film 14.2
vintage 14.1
entertainment 13.8
sound 12.2
digital 12.1
light 12
music 11.7
texture 11.1
industry 11.1
grunge 11.1
close 10.8
silver 10.6
container 9.9
computer 9.6
electrical 9.6
home 9.6
audio 9.6
stove 9.5
object 9.5
play 9.5
negative 9.4
camera 9.4
dark 9.2
plastic 9.2
industrial 9.1
design 9
metal 8.8
office 8.8
noise 8.8
movie 8.7
tape 8.7
empty 8.6
business 8.5
electric 8.4
electronic 8.4
data 8.2
dirty 8.1
steel 8
paper 7.8
objects 7.8
slide 7.8
nobody 7.8
video 7.7
media 7.6
studio 7.6
power 7.6
frame 7.5
plug 7.4
border 7.2
cut 7.2
art 7.2
information 7.1

Microsoft
created on 2022-02-19

indoor 95.9
poster 91.4
screenshot 77.6
text 65.9
book 55.4

Face analysis

AWS Rekognition

Age 49-57
Gender Female, 99.9%
Happy 88.2%
Calm 3.7%
Sad 3.5%
Confused 2.2%
Surprised 0.9%
Disgusted 0.6%
Angry 0.6%
Fear 0.3%

AWS Rekognition

Age 45-53
Gender Female, 99.9%
Happy 97.2%
Surprised 0.9%
Sad 0.8%
Calm 0.3%
Disgusted 0.2%
Fear 0.2%
Angry 0.2%
Confused 0.1%

Microsoft Cognitive Services

Age 56
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.3%
Person 99.3%