Human Generated Data

Title

Untitled (portrait of children inside house)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17420

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.5
Human 99.5
Person 98.2
Person 97
Person 96.7
Person 95.6
Shoe 92.2
Footwear 92.2
Clothing 92.2
Apparel 92.2
Interior Design 89
Indoors 89
Person 79.1
Room 74
People 66.2
Drawing 64.6
Art 64.6
Kid 63.7
Child 63.7
Girl 60.7
Female 60.7
Face 59.7
Living Room 56.9
Suit 55.4
Coat 55.4
Overcoat 55.4
Shoe 53.3
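
The label/confidence pairs above have the shape of Amazon Rekognition's object and scene detection output. As a rough illustration only, here is a minimal sketch of how such tags could be requested with boto3; the S3 bucket, object key, and region are placeholders, not details of the museum's actual pipeline.

# Minimal sketch: object/scene labels with Amazon Rekognition via boto3.
# Bucket, key, and region below are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photos/4.2002.17420.jpg"}},
    MaxLabels=50,
    MinConfidence=50,
)

# Print each label with its confidence, matching the list format above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')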

Clarifai
created on 2023-10-28

people 100
group 99.6
group together 99.4
child 99.1
many 98.4
recreation 96.9
adult 96.8
man 96.8
several 96.5
woman 95.8
street 94.5
boy 94.2
family 90.8
wear 89.3
monochrome 88.3
adolescent 86.4
music 85.9
sibling 84
education 83.1
interaction 80.6

Imagga
created on 2022-02-26

people 27.3
shop 22.8
person 22.1
man 20.2
portrait 20
case 19.1
ball 18.4
adult 18.2
toyshop 17.5
women 17.4
fashion 16.6
male 16.4
negative 15.9
black 14.4
human 14.2
film 14
sport 13.8
mercantile establishment 13.7
girls 13.7
dress 13.5
world 12.9
lifestyle 12.3
urban 12.2
play 12.1
city 11.6
indoors 11.4
group 11.3
men 11.2
soccer ball 11
active 10.8
activity 10.7
game 10.7
attractive 10.5
sexy 10.4
body 10.4
motion 10.3
athlete 10.2
casual 10.2
player 10.1
photographic paper 10.1
happy 10
exercise 10
fitness 9.9
team 9.8
equipment 9.8
style 9.6
home 9.6
standing 9.6
walking 9.5
model 9.3
speed 9.2
competition 9.1
place of business 9.1
room 9
fun 9
family 8.9
posing 8.9
window 8.8
hair 8.7
clothing 8.7
athletic 8.6
game equipment 8.5
legs 8.5
head 8.4
leisure 8.3
vintage 8.3
retro 8.2
life 8.1
lady 8.1
recreation 8.1
art 7.9
business 7.9
smile 7.8
happiness 7.8
face 7.8
luxury 7.7
pretty 7.7
old 7.7
health 7.6
healthy 7.5
one 7.5
professional 7.4
event 7.4
decoration 7.3
indoor 7.3
sensuality 7.3
interior 7.1

Google
created on 2022-02-26

Building 87.1
Style 84
Black-and-white 83.8
Art 79.1
Window 77.7
Monochrome 73.8
Monochrome photography 72.2
Event 69.9
Room 69.4
Visual arts 66.7
Glass 65.7
Stock photography 64.3
Font 61.9
Facade 59.5
Vintage clothing 58.9
Illustration 56.6
Sitting 53.9
Fun 53.6
Collection 50.4
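
The Google tags above are consistent with Google Cloud Vision label detection. A minimal sketch, assuming the google-cloud-vision client library, default credentials from the environment, and a placeholder local file name:

# Minimal sketch: label detection with the Google Cloud Vision client library.
# "photo.jpg" is a placeholder file name; credentials come from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are returned on a 0-1 scale; multiply by 100 to match the figures above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")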

Microsoft
created on 2022-02-26

text 93.7
person 86.1
clothing 84.4
drawing 83.7
toddler 65.3
old 49.9
posing 37.4

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 81.4%
Happy 78.4%
Calm 10%
Surprised 6.6%
Sad 2.8%
Confused 0.7%
Disgusted 0.6%
Fear 0.5%
Angry 0.5%

AWS Rekognition

Age 41-49
Gender Male, 91.3%
Happy 53.6%
Sad 45.2%
Calm 0.4%
Surprised 0.2%
Confused 0.2%
Fear 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 40-48
Gender Male, 91.8%
Happy 93.3%
Calm 4.5%
Surprised 0.9%
Disgusted 0.4%
Confused 0.4%
Sad 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 31-41
Gender Male, 79.3%
Calm 70.4%
Happy 25%
Sad 3%
Confused 0.6%
Disgusted 0.3%
Angry 0.2%
Fear 0.2%
Surprised 0.2%

AWS Rekognition

Age 18-26
Gender Male, 99%
Surprised 60.1%
Calm 31.7%
Sad 3.1%
Confused 1.3%
Disgusted 1.1%
Fear 0.9%
Happy 0.9%
Angry 0.9%

AWS Rekognition

Age 33-41
Gender Female, 77.7%
Happy 33.8%
Angry 24.7%
Calm 16.7%
Sad 14.1%
Surprised 3.2%
Disgusted 3.1%
Fear 2.8%
Confused 1.7%

AWS Rekognition

Age 25-35
Gender Male, 75.2%
Sad 45.3%
Surprised 31.2%
Fear 5.5%
Calm 4.8%
Confused 4.8%
Angry 4.7%
Happy 2%
Disgusted 1.7%
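
The per-face age ranges, gender guesses, and ranked emotions above match the structure of Amazon Rekognition's face analysis response. A minimal sketch, assuming boto3, a placeholder local file, and a placeholder region:

# Minimal sketch: face analysis with Amazon Rekognition via boto3.
# "photo.jpg" and the region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# Each FaceDetail carries an age range, a gender estimate, and per-emotion
# confidences, which is what the blocks above summarise.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')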

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
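
The Google Vision rows above report likelihood buckets (Very unlikely, Unlikely, and so on) rather than percentages, which is how the Vision API expresses face attributes. A minimal sketch, assuming the google-cloud-vision client library and a placeholder file name:

# Minimal sketch: face detection with the Google Cloud Vision client library.
# Likelihoods are returned as enum values such as VERY_UNLIKELY or UNLIKELY.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)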

Feature analysis

Amazon

Person 99.5%
Person 98.2%
Person 97%
Person 96.7%
Person 95.6%
Person 79.1%
Shoe 92.2%
Shoe 53.3%

Categories

Imagga

interior objects 99.5%

Text analysis

Amazon

2
KODOK - 2013
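
The detected strings above ("2" and "KODOK - 2013") read like optical character recognition of markings visible in the print, in the shape returned by Amazon Rekognition's text detection. A minimal sketch, assuming boto3 and a placeholder local file:

# Minimal sketch: text detection with Amazon Rekognition via boto3.
# "photo.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections correspond to short strings like those listed above;
# WORD detections break each line into individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])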