Human Generated Data

Title

[Lyonel and Andreas Feininger]

Date

1907

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.675.304

Machine Generated Data

Tags

Amazon
created on 2022-07-01

Person 99.5
Human 99.5
Play 99.5
Person 99
Face 98.6
Smile 98
Female 95.9
Clothing 95.8
Apparel 95.8
Outdoors 90
Kid 87.1
Child 87.1
Nature 86
Dress 85.4
Woman 80.8
Suit 80.3
Overcoat 80.3
Coat 80.3
Girl 80
Housing 79.4
Building 79.4
Portrait 78.6
Photography 78.6
Photo 78.6
Plant 76.5
Blonde 75.1
Teen 75.1
Person 74.5
Furniture 73.8
Baby 73.3
Boy 70.2
Tree 70.2
Indoors 68.7
Living Room 68.7
Room 68.7
People 68.2
Chair 68.1
Man 68
Animal 66.8
Bird 66.8
Yard 59.8
Leisure Activities 59.6
House 57.4
Laughing 56.2
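
The label-and-confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal boto3 sketch of such a call follows; the file name feininger_1907.jpg, the confidence threshold, and the label limit are illustrative assumptions, not values taken from this record.

import boto3

# Assumes AWS credentials are configured in the environment.
rekognition = boto3.client("rekognition")

with open("feininger_1907.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

# Print each label with its confidence, mirroring the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")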

Imagga
created on 2022-07-01

man 31.6
male 22.9
person 22.6
people 21.2
grandfather 19.6
adult 19
barbershop 18.5
shop 15.1
portrait 14.2
couple 13.9
child 13.9
black 13.8
happy 13.8
old 13.2
statue 12.4
senior 11.2
mercantile establishment 11.2
grandma 10.7
family 10.7
sculpture 10.6
mother 10.4
men 10.3
love 10.2
happiness 10.2
smiling 10.1
human 9.7
home 9.6
boy 9.6
sitting 9.4
face 9.2
dark 9.2
city 9.1
chair 9.1
danger 9.1
outdoors 9
world 8.9
businessman 8.8
lifestyle 8.7
culture 8.5
holding 8.2
protection 8.2
dirty 8.1
room 8
business 7.9
together 7.9
mask 7.7
casual 7.6
two 7.6
hand 7.6
place of business 7.6
life 7.5
vintage 7.4
parent 7.4
bow tie 7.4
dress 7.2
religion 7.2
romantic 7.1
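
The Imagga tags above could be reproduced with Imagga's REST tagging service. The sketch below assumes the v2 /tags endpoint with HTTP Basic authentication and a multipart image upload; the credentials, file name, and response shape are assumptions, not part of this record.

import requests

IMAGGA_KEY = "your_api_key"        # placeholder, not from this record
IMAGGA_SECRET = "your_api_secret"  # placeholder, not from this record

with open("feininger_1907.jpg", "rb") as f:  # placeholder file name
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        files={"image": f},
    )
response.raise_for_status()

# Assumed response shape: {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")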

Google
created on 2022-07-01

Black 89.6
Black-and-white 85.5
Style 84
Building 81.2
Monochrome photography 75.6
Monochrome 75.5
Art 71.5
Event 69.2
Fun 68.9
Vintage clothing 68.1
Room 67.3
Entertainment 63.3
House 62.3
Toddler 62.2
Visual arts 61.8
Child 60.6
Sitting 59.3
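
The Google labels above correspond to label detection in the Google Cloud Vision API. A minimal sketch with the google-cloud-vision client library follows; the file name is an assumption and credentials are assumed to come from the environment.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("feininger_1907.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# label.score is a 0-1 value; scaled here to match the percentages listed above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")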

Microsoft
created on 2022-07-01

person 87
clothing 86.5
black and white 85.8
human face 85.6
baby 83.1
toddler 82.4
text 69.5
man 53.4

Color Analysis

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 97.6%
Happy 52.8%
Calm 45.3%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
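
The age range, gender, and emotion percentages above match the shape of Amazon Rekognition's DetectFaces response when all attributes are requested. A minimal boto3 sketch follows; the file name is illustrative.

import boto3

rekognition = boto3.client("rekognition")

with open("feininger_1907.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")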

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
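
The likelihood ratings above correspond to Google Cloud Vision face detection, which reports each attribute as an enum ranging from VERY_UNLIKELY to VERY_LIKELY. A minimal sketch with the same client library follows; the file name is again an assumption.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("feininger_1907.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)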

Feature analysis

Amazon

Person 99.5%
Bird 66.8%

Captions

Microsoft
created on 2022-07-01

a man standing in a room 84.8%
a person posing for the camera 83.9%
a man posing for a photo 69.4%
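
The Microsoft tags and captions in this record match the output of the Azure Computer Vision SDK's tag and describe operations. The sketch below uses the azure-cognitiveservices-vision-computervision package; the endpoint, key, and file name are placeholders, not values from this record.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

with open("feininger_1907.jpg", "rb") as f:  # placeholder file name
    tag_result = client.tag_image_in_stream(f)

with open("feininger_1907.jpg", "rb") as f:
    describe_result = client.describe_image_in_stream(f, max_candidates=3)

# Tags and captions both carry a 0-1 confidence, shown above as percentages.
for tag in tag_result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")

for caption in describe_result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")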