Human Generated Data

Title

Untitled (two men holding two children)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16938

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.3
Human 99.3
Person 98.4
Person 96.9
Person 94.3
Furniture 91.5
People 86.5
Room 85.1
Indoors 85.1
Living Room 84.1
Interior Design 83.5
Tie 80.7
Accessories 80.7
Accessory 80.7
Lamp 80.1
Table Lamp 80.0
Couch 77.2
Monitor 76.9
Electronics 76.9
Display 76.9
Screen 76.9
Chair 76.2
Clothing 69.3
Apparel 69.3
Shorts 59.6
Home Decor 57.1
Sitting 55.6
Suit 55.5
Coat 55.5
Overcoat 55.5

Clarifai
created on 2023-10-28

people 99.9
group 98.7
child 98.1
group together 97.9
man 97.8
adult 96.7
boy 93.4
monochrome 93.3
woman 88.5
several 88.4
furniture 87.7
three 87.2
recreation 86.1
four 84.7
outfit 84.7
education 83.8
music 83.7
wear 83.3
baseball 79.8
interaction 78.1

Imagga
created on 2022-02-26

man 30.2
people 29
person 23.3
male 20.7
adult 20.4
indoors 19.3
home 18.3
work 16
medical 15.9
patient 15.4
chair 14.7
hospital 14.1
interior 13.3
nurse 13.2
happy 13.1
room 12.8
sitting 12
smiling 11.6
black 11.4
senior 11.2
occupation 11
father 10.9
medicine 10.6
human 10.5
doctor 10.3
men 10.3
women 10.3
professional 10
holding 9.9
family 9.8
health 9.7
working 9.7
business 9.7
together 9.6
couple 9.6
lifestyle 9.4
casual 9.3
old 9
musical instrument 8.7
love 8.7
illness 8.6
furniture 8.4
equipment 8.4
20s 8.2
dad 8.2
care 8.2
worker 8.2
smile 7.8
face 7.8
child 7.8
travel 7.7
laboratory 7.7
elderly 7.7
research 7.6
talking 7.6
instrument 7.6
hand 7.6
parent 7.6
leisure 7.5
surgeon 7.5
seat 7.4
cheerful 7.3
dress 7.2
portrait 7.1
science 7.1
wheelchair 7.1

Google
created on 2022-02-26

Photograph 94.2
Black 89.6
Black-and-white 84.3
Style 84
Monochrome 75.3
Snapshot 74.3
Room 73.5
Monochrome photography 72.8
Chair 72.4
Picture frame 71.9
Vintage clothing 68.8
T-shirt 66.5
Event 65.4
Curtain 64.1
Stock photography 63.7
Sitting 59.3
Child 58.2
Font 57.2
Team 57.1
Service 56.5

Microsoft
created on 2022-02-26

text 97.7
person 97.7
furniture 93.4
chair 70.1
house 54.9
clothing 51.3

Face analysis

AWS Rekognition

Age 16-22
Gender Male, 98%
Calm 63.6%
Angry 10%
Surprised 9.8%
Disgusted 5.8%
Fear 4.7%
Happy 3.1%
Sad 2.1%
Confused 0.9%

AWS Rekognition

Age 22-30
Gender Male, 98.4%
Calm 81.3%
Sad 8.2%
Happy 6.6%
Confused 2%
Disgusted 0.7%
Angry 0.6%
Surprised 0.5%
Fear 0.2%

AWS Rekognition

Age 48-56
Gender Male, 89.6%
Calm 57.5%
Happy 39%
Sad 0.9%
Surprised 0.8%
Confused 0.7%
Disgusted 0.6%
Angry 0.3%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Person 98.4%
Person 96.9%
Person 94.3%
Tie 80.7%
Suit 55.5%

Text analysis

Amazon

SALE

Google

MHYT3RA2 A
MHYT3RA2
A