Human Generated Data

Title

[Lux, Lyonel, Andreas and Laurence Feininger]

Date

1950s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.463.29

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Person 99.3
Human 99.3
Person 96.3
Furniture 87.8
Person 87
Person 76.5
Clinic 71.4
Room 66.7
Bedroom 66.7
Indoors 66.7
Bed 61.9
Text 57.6
Building 56.6
Housing 56.6
Pillow 55.8
Cushion 55.8
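The numbers beside each tag above are confidence percentages reported by the tagging service. A minimal sketch of how such a list might be post-processed locally (the data is taken from the Amazon list above; the function name and the 80% threshold are illustrative assumptions, not part of any service's API):

```python
# Machine-generated tags as (label, confidence %) pairs, copied from the
# Amazon list above.
amazon_tags = [
    ("Person", 99.3), ("Human", 99.3), ("Person", 96.3), ("Furniture", 87.8),
    ("Person", 87.0), ("Person", 76.5), ("Clinic", 71.4), ("Room", 66.7),
    ("Bedroom", 66.7), ("Indoors", 66.7), ("Bed", 61.9), ("Text", 57.6),
    ("Building", 56.6), ("Housing", 56.6), ("Pillow", 55.8), ("Cushion", 55.8),
]

def confident_labels(tags, threshold=80.0):
    """Return distinct labels whose confidence meets the threshold,
    preserving the order in which they first appear."""
    seen = []
    for label, score in tags:
        if score >= threshold and label not in seen:
            seen.append(label)
    return seen

print(confident_labels(amazon_tags))  # ['Person', 'Human', 'Furniture']
```

With the default 80% cutoff, only the people and furniture detections survive; lower-confidence guesses such as "Clinic" or "Bedroom" are dropped.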

Clarifai
created on 2021-04-03

filmstrip 98.9
negative 98.8
movie 98.7
exposed 98.3
cinematography 97.9
monochrome 97.4
slide 96.3
man 96.3
people 95.5
sit 94.5
adult 92.5
group 91.7
retro 89.5
graphic design 88.8
illustration 88.2
business 87.5
woman 86.7
no person 86.3
screen 84.6
indoors 84.5

Imagga
created on 2021-04-03

furniture 35.9
room 35.6
interior 35.4
modern 33.7
home 32.8
house 30.1
design 29.9
3d 25.6
architecture 25.1
table 21.4
apartment 19.2
business 17.6
negative 17.4
living 17.1
frame 16.4
floor 15.8
lamp 15.4
decor 15
sofa 14.6
construction 14.5
vase 14.5
relax 14.3
film 14.3
galley 14.2
contemporary 14.1
wall 13.8
empty 13.7
indoor 13.7
residential 13.4
comfortable 13.4
panel 13.4
render 13
window 12.9
inside 12.9
domestic 12.7
style 12.6
luxury 12
technology 11.9
graphic 11.7
couch 11.6
comfort 11.6
vessel 11.5
light 11.4
photographic paper 11.1
set 11
clean 10.9
structure 10.8
element 10.8
office 10.6
indoors 10.5
new 10.5
rendering 10.5
icon 10.3
lifestyle 10.1
elegance 10.1
pillow 9.8
carpet 9.7
building 9.5
desk 9.1
sign 9
object 8.8
scene 8.7
vehicle 8.6
bedroom 8.6
glass 8.6
space 8.5
case 8.5
horizontal 8.4
color 8.3
wood 8.3
plant 8.2
craft 8.2
art 8.1
drawing 8.1
computer 8
decoration 8
blank 7.9
nobody 7.8
equipment 7.7
built 7.7
elegant 7.7
bed 7.7
estate 7.6
chair 7.6
relaxation 7.5
symbol 7.4
photographic equipment 7.4
reflection 7.3
data 7.3
furnishing 7.2

Google
created on 2021-04-03

Photograph 94.2
Font 83.3
Line 82
Rectangle 79.6
Shelf 73.7
Monochrome 68.9
Monochrome photography 68.6
Room 67.3
Shelving 66.3
Machine 64.5
Negative 59.1
History 58.5
Building 56.1
Photo caption 53.6
Art 53.6
Illustration 53
Photography 51.6

Microsoft
created on 2021-04-03

text 93.1
screenshot 75.5
person 64.3
house 59.3
black and white 58.1
human face 56.7
old 49.2
kitchen appliance 16.5

Face analysis

AWS Rekognition

Age 45-63
Gender Female, 59%
Calm 89.3%
Sad 8.1%
Happy 1.7%
Confused 0.5%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 13-23
Gender Female, 52%
Calm 82.7%
Happy 14.4%
Sad 2.2%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 10-20
Gender Female, 62.5%
Sad 53.7%
Calm 25.5%
Happy 16.1%
Fear 2.1%
Confused 1.1%
Angry 1%
Disgusted 0.2%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

an old photo of a person 66.2%
old photo of a person 65.5%
an old photo of a person 55.4%