Human Generated Data

Title

Untitled (boy and girl sitting in living room with dog)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16924

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99
Human 99
Person 98.9
Furniture 94
Chair 87.1
Shoe 76.3
Footwear 76.3
Clothing 76.3
Apparel 76.3
Person 70.3
Room 69.2
Indoors 69.2
Mannequin 57.3
Person 41.7
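
A minimal sketch of how a label list like the Amazon set above could be produced, assuming the photograph is stored in S3 and queried through AWS Rekognition's DetectLabels API with boto3; the bucket name, object key, and thresholds are illustrative placeholders, not values from this record.

```python
# Hypothetical example: generate "label confidence" pairs like the Amazon tag
# list above with AWS Rekognition DetectLabels. Bucket, key, region, and
# thresholds are placeholders for illustration only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-photo.jpg"}},
    MaxLabels=20,
    MinConfidence=40.0,
)

for label in response["Labels"]:
    # Prints pairs such as "Person 99.0", matching the format of the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```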

Clarifai
created on 2023-10-28

people 99.9
group 98.4
monochrome 97.3
child 96.2
two 96.1
adult 95.7
man 95.6
woman 94.1
family 94
music 93.7
actor 93.6
actress 92.6
three 92.5
wear 92
outfit 91.6
indoors 90.9
nostalgia 90.6
musician 88.9
leader 87.9
chair 87

Imagga
created on 2022-02-26

barbershop 69.6
shop 55.5
mercantile establishment 43.3
man 34.3
people 29.5
place of business 29
male 26.9
person 26.9
men 20.6
chair 19.5
adult 19.4
room 19
businessman 17.6
business 17.6
blackboard 17.6
office 15.7
portrait 15.5
establishment 14.5
casual 14.4
women 12.6
work 12.5
indoors 12.3
teacher 12.1
looking 12
home 12
indoor 11.9
patient 11.3
back 11
two 11
communication 10.9
dress 10.8
happy 10.6
interior 10.6
black 10.2
happiness 10.2
lifestyle 10.1
window 10.1
smile 10
board 9.9
case 9.9
family 9.8
human 9.7
job 9.7
professional 9.7
life 9.6
smiling 9.4
health 9
sexy 8.8
computer 8.8
nurse 8.7
couple 8.7
check 8.7
uniform 8.7
modern 8.4
pretty 8.4
silhouette 8.3
barber chair 8.3
holding 8.2
girls 8.2
working 8
sitting 7.7
attractive 7.7
bride 7.7
finance 7.6
house 7.5
clinic 7.5
cheerful 7.3
furniture 7.2
team 7.2
handsome 7.1
medical 7.1

Microsoft
created on 2022-02-26

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 91.1%
Calm 55%
Surprised 27.5%
Happy 11.5%
Sad 2.2%
Fear 1.8%
Disgusted 0.8%
Angry 0.6%
Confused 0.5%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Calm 83.7%
Surprised 14.4%
Sad 0.6%
Disgusted 0.5%
Confused 0.3%
Angry 0.2%
Fear 0.1%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
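
The age range, gender, and emotion scores listed under AWS Rekognition above are the kind of per-face output its DetectFaces API returns when all facial attributes are requested. The sketch below is a hypothetical illustration of such a call; the file name and region are placeholders, not details from this record.

```python
# Hypothetical example: read age range, gender, and emotion confidences for
# each detected face with AWS Rekognition DetectFaces. File name is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("example-photo.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are listed highest-confidence first, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```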

Feature analysis

Amazon

Person 99%
Person 98.9%
Person 70.3%
Person 41.7%
Shoe 76.3%

Text analysis

Amazon

8
-MACON
утазла -MACON
утазла
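
The strings above are raw machine-read text; partially legible or reflected lettering in a photograph commonly yields fragments like these. A minimal sketch of how such detections could be retrieved with AWS Rekognition's DetectText API follows, with the file name as a placeholder.

```python
# Hypothetical example: list detected text lines with AWS Rekognition DetectText.
# File name and region are placeholders for illustration only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("example-photo.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

for detection in response["TextDetections"]:
    # Keep line-level detections; word-level entries repeat the same content.
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```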