Human Generated Data

Title

Untitled (family portrait)

Date

c. 1950

People

Artist: Lainson Studios

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21888

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Architecture 100
Building 100
Furniture 100
Indoors 100
Living Room 100
Room 100
Lamp 100
Table Lamp 99.7
Formal Wear 99.2
Person 99.1
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.8
Adult 98.8
Male 98.8
Man 98.8
Clothing 98.7
People 98.7
Couch 98.4
Person 98.4
Adult 98.4
Male 98.4
Man 98.4
Face 96.8
Head 96.8
Photography 96.8
Portrait 96.8
Animal 96.3
Canine 96.3
Dog 96.3
Mammal 96.3
Pet 96.3
Interior Design 95.8
Dress 93.6
Dining Room 93.5
Dining Table 93.5
Table 93.5
Hound 93.4
Suit 88.9
Suit 81.4
Cabinet 80
Chair 79.7
Coat 79.3
Fireplace 73.6
Jacket 65.8
Footwear 59.1
Shoe 59.1
Chandelier 57.7
Puppy 57.5
Accessories 57.1
Tie 57.1
Blazer 56.9
Lampshade 56.7
Tuxedo 56.5
Fashion 56.5
Gown 56.5
Lighting 56.4
Armchair 55.9
Bridegroom 55.6
Wedding 55.6
Skirt 55.6
Shoe 55.5
Beagle 55.4
Door 55.1

Clarifai
created on 2018-08-23

people 99.6
group 97
administration 96.3
adult 95.4
indoors 95.3
leader 94.9
room 94.8
woman 93.5
chair 92.3
man 92.3
canine 88.4
actress 86.8
home 86.5
group together 86.5
furniture 86.1
sit 85.9
wear 84
outfit 82.9
family 80.3
five 74.9

Imagga
created on 2018-08-23

man 25.5
military uniform 24
uniform 23.2
people 22.8
person 22.8
male 21.5
happy 20.7
home 19.1
adult 18.5
blackboard 18.3
room 17.6
child 17.2
clothing 16
smile 15.7
teacher 15.1
interior 14.1
indoors 14
smiling 13.7
sitting 13.7
portrait 13.6
old 13.2
business 12.7
businessman 12.3
education 12.1
girls 11.8
house 11.7
family 11.6
covering 11.5
lady 11.3
classroom 11.3
fun 11.2
casual 11
school 10.9
holding 10.7
cheerful 10.6
attractive 10.5
consumer goods 10.3
senior 10.3
two 10.2
lifestyle 10.1
indoor 10
blond 9.8
pretty 9.8
couch 9.7
together 9.6
couple 9.6
boy 9.6
student 9.3
life 9.3
world 8.9
chair 8.9
computer 8.9
office 8.8
looking 8.8
work 8.8
women 8.7
happiness 8.6
laptop 8.5
fashion 8.3
board 8.1
childhood 8.1
handsome 8
class 7.7
wall 7.7
playing 7.3
businesswoman 7.3
group 7.2
color 7.2
cute 7.2
kid 7.1

Google
created on 2018-08-23

photograph 95.5
mammal 92.6
vertebrate 90.4
snapshot 81.8
dog like mammal 81.6
dog 80.8
human behavior 76.3
vintage clothing 74.3
table 73.4
furniture 71.5
picture frame 64.9
house 60.2
family 59.5
carnivoran 57.6
gentleman 56.9
chair 51.1
collection 51.1

Microsoft
created on 2018-08-23

wall 99.2
floor 98.4
indoor 97.5
old 54.3
posing 49.9

Face analysis

AWS Rekognition

Age 18-24
Gender Male, 98.8%
Calm 51.3%
Happy 20%
Confused 17.8%
Surprised 7.5%
Fear 6.2%
Angry 3.8%
Sad 3.2%
Disgusted 1.1%

AWS Rekognition

Age 36-44
Gender Female, 99.8%
Calm 53.8%
Surprised 23.1%
Happy 8.7%
Fear 8%
Confused 6.7%
Angry 5.5%
Sad 2.7%
Disgusted 1.2%

AWS Rekognition

Age 36-44
Gender Male, 100%
Happy 99.2%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Calm 0.2%
Angry 0.1%
Confused 0.1%
Disgusted 0%

AWS Rekognition

Age 11-19
Gender Female, 81.4%
Calm 93.5%
Surprised 6.3%
Fear 5.9%
Sad 3.3%
Confused 1.8%
Angry 1%
Disgusted 0.1%
Happy 0.1%

Microsoft Cognitive Services

Age 47
Gender Male

Microsoft Cognitive Services

Age 18
Gender Female

Microsoft Cognitive Services

Age 18
Gender Male

Microsoft Cognitive Services

Age 49
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Adult 98.9%
Male 98.9%
Man 98.9%
Dog 96.3%
Suit 88.9%
Shoe 59.1%

Text analysis

Amazon

PROOF
DENVER
LAINSON
LAINSON and
QUE
®
TINESE
GCILLS THE ® QUE TINESE TINEN
THE
and
TINEN
GCILLS
roose