Human Generated Data

Title

Untitled (man rolling out carpet)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19324

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 97.2
Human 97.2
Flooring 96.5
Clothing 96.4
Apparel 96.4
Person 93.2
Floor 90.4
Mannequin 80.6
Indoors 73.8
Sleeve 70
Sitting 69.3
Room 63.8
Photography 59.1
Photo 59.1
Furniture 57.1
Long Sleeve 56.5
Coat 56
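
The Amazon tags above are the kind of label-and-confidence output produced by the AWS Rekognition label-detection API. The following is a minimal sketch of how such labels might be requested with boto3; the file name sample.jpg and the thresholds are assumptions for illustration, not part of the museum record.

import boto3

# Assumption: AWS credentials are configured and the image is available locally.
client = boto3.client("rekognition")

with open("sample.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Print each label with its confidence score, similar to the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")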

Clarifai
created on 2023-10-22

people 99.6
monochrome 98.6
indoors 98.2
man 97.7
room 97.7
adult 97.7
one 97
portrait 95.8
woman 93.3
two 92
black and white 91.3
street 90.6
furniture 89.6
wear 89
three 87.5
inside 87.1
chair 86.4
window 86.1
door 85.1
doorway 84.7

Imagga
created on 2022-03-05

crutch 26.5
boutique 25.6
staff 21.2
people 20.6
man 18.8
interior 18.6
male 16.3
stick 15.9
building 15.7
dress 15.3
bride 14.4
wedding 13.8
adult 13.5
happy 13.1
person 12.9
women 12.6
architecture 12
travel 12
city 11.6
urban 11.3
fashion 11.3
room 11
love 11
happiness 11
business 10.9
suit 10.8
light 10.7
groom 10.7
cleaner 10.4
home 10.4
church 10.2
house 10.1
history 9.8
old 9.7
blackboard 9.7
indoors 9.7
chair 9.6
standing 9.6
men 9.4
luxury 9.4
hand 9.1
modern 9.1
religion 9
sky 8.9
work 8.9
wall 8.9
businessman 8.8
professional 8.7
marriage 8.5
clothing 8.5
clothes 8.4
holding 8.2
tourism 8.2
smiling 8
lifestyle 7.9
office 7.9
couple 7.8
window 7.8
two 7.6
bell 7.6
bouquet 7.5
street 7.4
lady 7.3
new 7.3
life 7.3
black 7.2
device 7.2
celebration 7.2
portrait 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

black and white 96.8
statue 86.8
indoor 85.5
text 78.3
monochrome 78
white 63.8
clothing 53.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-36
Gender Female, 95.1%
Angry 42.1%
Sad 40%
Confused 8.9%
Calm 6.6%
Happy 0.9%
Fear 0.9%
Disgusted 0.4%
Surprised 0.3%
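
Age range, gender, and emotion scores like those listed above correspond to the face attributes Rekognition returns from its detect_faces call when all attributes are requested. A hedged sketch follows, again assuming a local sample.jpg.

import boto3

client = boto3.client("rekognition")

with open("sample.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions are returned unsorted; sort by confidence for a ranked list like the one above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")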

Feature analysis

Amazon

Person
Person 97.2%
Person 93.2%

Categories

Text analysis

Amazon

12
11
MAQOX
2.0
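
The strings under Text analysis resemble the DetectedText values returned by Rekognition's text-detection API; short fragments such as "MAQOX" are typical readings of partially legible print in the photograph. A minimal sketch, assuming the same local sample.jpg.

import boto3

client = boto3.client("rekognition")

with open("sample.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections cover whole lines of text; WORD detections cover individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])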