Human Generated Data

Title

Untitled (family in living room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17155

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.9
Apparel 99.9
Person 99.2
Human 99.2
Person 98.9
Person 97.2
Person 97
Person 95.4
Female 94
Person 93.4
Dress 91.3
Woman 84.6
Gown 80.2
Fashion 80.2
Robe 78.5
Suit 76.7
Overcoat 76.7
Coat 76.7
Person 72.7
Sleeve 69
Skirt 66.6
Portrait 66.3
Face 66.3
Photography 66.3
Photo 66.3
Wedding 65.3
Girl 63.6
Wedding Gown 63.1
Indoors 62.7
Flower 62.2
Plant 62.2
Blossom 62.2
People 61.5
Evening Dress 59.9
Room 58.4
Shoe 57.6
Footwear 57.6
Drawing 56.9
Art 56.9
Living Room 56.9
Shorts 56.8
Shoe 51.1
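
The label confidences above are the kind of output returned by Amazon Rekognition's label-detection call. A minimal Python sketch with boto3 follows; the S3 bucket, object key, and thresholds are placeholder assumptions, not values taken from this record.

    # Hedged sketch: list labels for an image stored in S3 with Amazon Rekognition.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.17155.jpg"}},
        MaxLabels=50,
        MinConfidence=50.0,
    )

    for label in response["Labels"]:
        # Prints lines like "Clothing 99.9", matching the format of the list above.
        print(f'{label["Name"]} {label["Confidence"]:.1f}')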

Clarifai
created on 2023-10-28

people 99.8
group 99.2
man 98.9
woman 98.3
adult 96.7
monochrome 93.9
family 91.5
actor 90.5
child 89.5
education 89.4
indoors 85.4
medical practitioner 84
group together 83.3
leader 82
teacher 81.3
wear 79.7
three 78.8
many 73.8
several 73.5
room 73.5
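
The Clarifai concepts above come from a general image-recognition model. A hedged sketch against Clarifai's v2 REST predict endpoint follows; the personal access token, model ID, and image URL are assumptions, not values from this record.

    # Hedged sketch: request concepts for an image URL from Clarifai's v2 API.
    import requests

    PAT = "YOUR_CLARIFAI_PAT"                # assumption: a valid personal access token
    MODEL_ID = "general-image-recognition"   # assumption: Clarifai's general model

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {PAT}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Prints lines like "people 99.8", matching the format of the list above.
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')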

Imagga
created on 2022-02-26

businessman 50.4
business 45
man 43.1
male 42.6
people 41.3
corporate 38.7
person 36.2
team 35
office 33.8
professional 33.3
men 32.7
group 32.3
adult 31.9
teamwork 30.6
happy 28.9
meeting 28.3
businesswoman 28.2
businesspeople 27.6
work 27.5
executive 26.7
job 26.6
nurse 26.2
colleagues 25.3
suit 23.1
worker 22.7
smiling 21.7
success 20.9
manager 20.5
portrait 19.4
handsome 18.7
together 18.4
company 17.7
successful 17.4
confident 17.3
boss 17.2
women 16.6
cooperation 16.5
smile 16.4
modern 16.1
communication 16
casual 15.3
looking 15.2
career 15.2
employee 15.1
associates 14.8
attractive 14.7
working 14.2
businessmen 13.7
partner 13.5
desk 13.2
workers 12.6
staff 12.6
partnership 12.5
coat 12.3
standing 12.2
coworkers 11.8
room 11.6
clothing 11.6
leadership 11.5
ethnic 11.4
couple 11.3
entrepreneur 11.2
lab coat 11.2
jacket 11.1
occupation 11
20s 11
laptop 10.9
30s 10.6
diversity 10.6
mature 10.2
indoor 10.1
human 9.8
discussion 9.7
indoors 9.7
corporation 9.7
leader 9.6
black 9.6
formal 9.6
tie 9.5
life 9.5
doctor 9.4
outfit 9.4
computer 8.9
collaboration 8.9
discussing 8.9
medical 8.8
home 8.8
lifestyle 8.7
happiness 8.6
patient 8.6
hospital 8.5
guy 8.4
presentation 8.4
fashion 8.3
silhouette 8.3
new 8.1
interaction 7.9
day 7.9
conference 7.8
face 7.8
40s 7.8
table 7.8
partners 7.8
pretty 7.7
profession 7.7
health 7.7
two 7.6
adults 7.6
smart 7.5
camera 7.4
light 7.4
secretary 7.3
friendly 7.3
cheerful 7.3
color 7.2
bright 7.2
building 7.2
uniform 7.1
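
The Imagga tags above are the kind of response produced by Imagga's tagging endpoint. A hedged sketch using the v2 REST API follows; the API key, secret, and image URL are placeholder assumptions.

    # Hedged sketch: request tags for an image URL from Imagga's v2 API.
    import requests

    API_KEY = "YOUR_IMAGGA_KEY"        # assumption
    API_SECRET = "YOUR_IMAGGA_SECRET"  # assumption

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/image.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        # Prints lines like "businessman 50.4", matching the format of the list above.
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')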

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 96.7
clothing 96.4
footwear 91.6
person 89.8
standing 82.3
posing 78.3
man 77.4
woman 77
drawing 68.7
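
The Microsoft tags above resemble output from Azure's Computer Vision image-tagging operation. A hedged sketch with the Python SDK follows; the endpoint, key, and image URL are placeholder assumptions.

    # Hedged sketch: tag an image URL with the Azure Computer Vision SDK.
    from msrest.authentication import CognitiveServicesCredentials
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient

    client = ComputerVisionClient(
        "https://example-resource.cognitiveservices.azure.com/",  # assumption
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),           # assumption
    )

    result = client.tag_image("https://example.org/image.jpg")
    for tag in result.tags:
        # Confidence is a 0-1 float; printed as a percentage like the list above.
        print(f"{tag.name} {tag.confidence * 100:.1f}")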

Color Analysis

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Happy 81%
Sad 8.9%
Disgusted 3.4%
Confused 1.9%
Angry 1.9%
Calm 1%
Surprised 1%
Fear 0.9%

AWS Rekognition

Age 45-51
Gender Male, 98.8%
Happy 56.8%
Calm 39.6%
Confused 1.1%
Surprised 0.7%
Sad 0.7%
Disgusted 0.3%
Fear 0.3%
Angry 0.3%

AWS Rekognition

Age 33-41
Gender Male, 88.1%
Sad 31.5%
Calm 30.7%
Angry 12.1%
Happy 10.9%
Fear 8%
Disgusted 3.2%
Confused 2.4%
Surprised 1.2%
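
The per-face age ranges, gender guesses, and emotion scores above are the kind of result returned by Amazon Rekognition face detection. A minimal boto3 sketch follows; the S3 location is a placeholder assumption.

    # Hedged sketch: per-face attribute estimates with Amazon Rekognition.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.17155.jpg"}},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            # Emotion types come back uppercase (e.g. "HAPPY") with confidences.
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')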

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
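
The Google Vision blocks above report per-face likelihoods rather than percentages. A hedged sketch with the google-cloud-vision client follows; the image URI is a placeholder assumption.

    # Hedged sketch: per-face likelihoods with the Google Cloud Vision API.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/image.jpg"))

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihoods are enum values such as VERY_UNLIKELY, as in the blocks above.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)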

Feature analysis

Amazon

Person
Shoe
Person 99.2%
Person 98.9%
Person 97.2%
Person 97%
Person 95.4%
Person 93.4%
Person 72.7%
Shoe 57.6%
Shoe 51.1%

Categories

Imagga

interior objects 98.8%

Text analysis

Amazon

22
ГОД
YE33A2

Google

TOA YT33A2
TOA
YT33A2
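
The fragments above are OCR readings of text visible in the print. A hedged sketch of text detection with the Google Cloud Vision client follows; the image URI is a placeholder assumption.

    # Hedged sketch: detect text fragments with the Google Cloud Vision API.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/image.jpg"))

    response = client.text_detection(image=image)
    for annotation in response.text_annotations:
        # The first entry is the full detected block; later entries are
        # individual fragments such as "TOA" and "YT33A2" above.
        print(annotation.description)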