Human Generated Data

Title

Untitled (large family portrait)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17257

Machine Generated Data

Tags

The number after each tag is the model's confidence score, expressed as a percentage.

Amazon
created on 2022-02-26

Person 99.4
Human 99.4
Person 99.3
Person 98.3
Person 98.1
Clothing 97.9
Apparel 97.9
Person 97.2
Person 96.7
Person 96.4
Person 96.2
Shorts 95.9
Tie 95
Accessories 95
Accessory 95
Person 85.4
People 82.1
Person 81.7
Person 66.7
Suit 64
Coat 64
Overcoat 64
Shoe 59.9
Footwear 59.9
Shoe 59.9
Female 59.5
Floor 57.1
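
For reference, these labels follow the shape of AWS Rekognition's DetectLabels response. Below is a minimal boto3 sketch; the file name and MinConfidence threshold are illustrative assumptions, since the record does not document the original call parameters.

```python
# Sketch: image labeling with AWS Rekognition via boto3.
# "photo.jpg" and MinConfidence=55 are assumptions for illustration.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # only return labels scored at 55% or above
    )

for label in response["Labels"]:
    # prints e.g. "Person 99.4", matching the list above
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```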

Clarifai
created on 2023-10-29

people 99.8
group 99.7
education 98.5
many 98.2
group together 97.6
woman 96.9
child 96.7
adult 96.4
school 95.9
man 95.6
teacher 92.9
boy 91
elementary school 89.5
adolescent 88.6
several 87.2
administration 84.5
wear 84.5
music 84.1
leader 82.5
rehearsal 79.3
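
Clarifai exposes tags like these through its v2 "outputs" REST endpoint. The sketch below is hedged: the model ID, the auth style (API key versus personal access token), and the exact response shape vary by account and API version, so every identifier here is an assumption.

```python
# Hedged sketch: Clarifai v2 REST tagging. MODEL_ID, the key-style
# Authorization header, and "photo.jpg" are all assumptions.
import base64
import requests

API_KEY = "YOUR_CLARIFAI_KEY"
MODEL_ID = "general-image-recognition"  # assumed public general model

with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # concept values are 0-1; scaled to match the percentages above
    print(concept["name"], round(concept["value"] * 100, 1))
```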

Imagga
created on 2022-02-26

people 29.5
person 28
sport 25.6
adult 23.4
teacher 22.3
man 21.5
male 21.3
men 20.6
group 19.3
lifestyle 17.3
professional 16.8
athlete 16.6
black 16.2
business 15.8
dancer 15.6
silhouette 14.9
educator 14.3
businessman 14.1
ball 13.9
couple 13.9
women 13.4
happy 13.2
active 12.9
performer 12.6
life 12.1
player 12
portrait 11.6
outdoor 11.5
boy 10.4
supporter 10.4
girls 10
exercise 10
leisure 10
fun 9.7
success 9.7
bride 9.6
entertainer 9.4
smiling 9.4
happiness 9.4
competition 9.1
attractive 9.1
contestant 9.1
holding 9.1
bathing cap 9.1
fitness 9
team 9
outdoors 9
clothing 8.9
job 8.8
dance 8.7
love 8.7
runner 8.6
child 8.6
youth 8.5
friends 8.5
friendship 8.4
room 8.2
employee 8.1
recreation 8.1
body 8
education 7.8
corporate 7.7
run 7.7
crowd 7.7
casual 7.6
two 7.6
human 7.5
office 7.2
looking 7.2
cap 7.1
handsome 7.1
game 7.1
summer 7.1
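
Imagga serves tags like these from its v2 /tags endpoint over HTTP basic auth. A hedged sketch follows, with placeholder credentials and file name; the response fields follow Imagga's public documentation (a confidence value plus an English tag string).

```python
# Hedged sketch: Imagga v2 tagging. Credentials and "photo.jpg"
# are placeholders; response fields follow Imagga's public docs.
import requests

resp = requests.post(
    "https://api.imagga.com/v2/tags",
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    files={"image": open("photo.jpg", "rb")},
)

for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```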

Google
created on 2022-02-26

Photograph 94.2
Black 89.7
Style 83.9
Black-and-white 82.2
Shorts 82
Suit 79
Font 78.4
Monochrome 75.9
Snapshot 74.3
Vintage clothing 73.8
Window 73.6
Team 72.3
Building 72.1
Event 72
Crew 69.5
Monochrome photography 67.9
History 63.7
Stock photography 63.6
Room 63.4
Art 63.3
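
These entries match the Google Cloud Vision API's label detection output, where scores are returned in the 0-1 range. A minimal sketch with the google-cloud-vision client; "photo.jpg" is an assumption.

```python
# Sketch: label detection with the google-cloud-vision client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # scores are 0-1; scaled to match the percentages above
    print(label.description, round(label.score * 100, 1))
```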

Microsoft
created on 2022-02-26

person 99
clothing 93.1
text 91.4
outdoor 86.6
man 65.7
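
Microsoft's tags correspond to Azure Computer Vision's tagging operation. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders.

```python
# Hedged sketch: Azure Computer Vision tagging. The endpoint, key,
# and "photo.jpg" are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # confidences are 0-1; scaled to match the percentages above
    print(tag.name, round(tag.confidence * 100, 1))
```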

Face analysis

AWS Rekognition (face 1)

Age 39-47
Gender Female, 96.9%
Happy 87.3%
Calm 5.2%
Fear 3.1%
Surprised 1.7%
Sad 1.4%
Angry 0.6%
Disgusted 0.4%
Confused 0.3%

AWS Rekognition (face 2)

Age 30-40
Gender Male, 96%
Calm 41%
Sad 25.4%
Happy 21.6%
Confused 4.2%
Disgusted 2.6%
Angry 2.4%
Surprised 1.6%
Fear 1.3%

AWS Rekognition (face 3)

Age 42-50
Gender Female, 55.7%
Sad 83.6%
Calm 12.3%
Disgusted 1%
Confused 0.8%
Happy 0.8%
Angry 0.8%
Fear 0.5%
Surprised 0.3%

AWS Rekognition (face 4)

Age 38-46
Gender Male, 99.9%
Happy 89.6%
Surprised 7.5%
Calm 1.1%
Angry 0.9%
Sad 0.3%
Confused 0.2%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition (face 5)

Age 34-42
Gender Male, 91.5%
Calm 82.9%
Confused 6.7%
Sad 3.6%
Surprised 2.7%
Happy 1.9%
Fear 1%
Angry 0.7%
Disgusted 0.5%

AWS Rekognition (face 6)

Age 43-51
Gender Male, 99.9%
Calm 71.3%
Happy 15.4%
Sad 5.4%
Surprised 4.7%
Disgusted 1%
Fear 0.8%
Confused 0.8%
Angry 0.7%

AWS Rekognition (face 7)

Age 49-57
Gender Male, 98.4%
Happy 52.5%
Sad 24.2%
Calm 11.3%
Surprised 4.3%
Confused 3.8%
Disgusted 1.7%
Fear 1.2%
Angry 0.9%

AWS Rekognition (face 8)

Age 39-47
Gender Male, 99.8%
Sad 97.8%
Calm 1.5%
Happy 0.2%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition (face 9)

Age 28-38
Gender Female, 98%
Sad 64.4%
Calm 34.1%
Disgusted 0.3%
Angry 0.3%
Happy 0.3%
Surprised 0.2%
Confused 0.2%
Fear 0.2%
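
Each block above (age range, gender, ranked emotions) matches one entry in Rekognition's DetectFaces response when all facial attributes are requested. A minimal boto3 sketch, with the file name as an assumption:

```python
# Sketch: per-face attributes with AWS Rekognition DetectFaces.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # emotions arrive unsorted; rank them as in the blocks above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```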

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 4)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 5)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 6)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 7)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision (face 8)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 9)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 10)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 11)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 12)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 13)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
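
Unlike Rekognition, the Vision API reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why every row above reads "Very unlikely" or "Likely". A minimal sketch with the google-cloud-vision client; "photo.jpg" is an assumption.

```python
# Sketch: face likelihoods with the google-cloud-vision client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name)  # e.g. VERY_UNLIKELY
```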

Feature analysis

Amazon

Person 99.4%
Person 99.3%
Person 98.3%
Person 98.1%
Person 97.2%
Person 96.7%
Person 96.4%
Person 96.2%
Person 85.4%
Person 81.7%
Person 66.7%
Tie 95%
Shoe 59.9%
Shoe 59.9%

Text analysis

Amazon

YTERAS
KACOX
289
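
Strings such as "YTERAS" and "KACOX" are raw OCR hits, left here exactly as detected; fragments like these often come from partial or reflected signage in the scene. They follow the shape of Rekognition's DetectText output, sketched below with the file name as an assumption.

```python
# Sketch: text detection with AWS Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```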