Human Generated Data

Title

Untitled (three men playing instruments in fancy carpeted room)

Date

1941

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9029

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.2
Human 99.2
Person 98.8
Person 97.1
Floor 90.5
Clothing 87.2
Apparel 87.2
Furniture 87.2
Chair 85.8
Flooring 81
Building 73.8
Shelter 71.3
Nature 71.3
Countryside 71.3
Rural 71.3
Outdoors 71.3
Path 69
Female 68.9
Sitting 67.1
Architecture 65.3
Photography 65.2
Photo 65.2
Meal 62.1
Food 62.1
Girl 60.1
Leisure Activities 58.7
Chair 57.6
Guitar 56.8
Musical Instrument 56.8
Urban 55.3
Woman 55.1

Clarifai
created on 2023-10-26

people 99.8
man 98.4
adult 97.7
group together 95.9
woman 95.9
group 94.8
music 93.9
musician 88.2
education 86.3
child 84.2
indoors 84
chair 84
two 83.2
sitting 82.6
recreation 81.5
many 81.2
actor 80.8
wear 78.8
leader 77.9
veil 77.5

Imagga
created on 2022-01-23

vibraphone 47.1
percussion instrument 45.4
musical instrument 43.7
chair 35.6
device 30.2
man 28.2
business 24.9
people 24
male 22
seat 19.9
businessman 18.5
person 16.7
sitting 16.3
office 16.3
rocking chair 15.8
adult 15.6
shopping cart 15.1
work 14.5
room 14.5
table 13.8
group 13.7
interior 13.3
silhouette 13.2
men 12.9
modern 12.6
meeting 12.2
couple 12.2
corporate 12
women 11.9
harp 11.7
suit 11.7
lifestyle 11.6
building 11.4
newspaper 11.3
handcart 11.3
manager 11.2
professional 10.7
happy 10.7
stringed instrument 10.5
furniture 10.3
black 10.2
glass 10.1
communication 10.1
urban 9.6
laptop 9.5
day 9.4
floor 9.3
support 9.2
window 9.2
city 9.1
product 9.1
portrait 9.1
outdoors 9
job 8.8
working 8.8
indoors 8.8
life 8.5
wheeled vehicle 8.5
attractive 8.4
relaxation 8.4
company 8.4
leisure 8.3
computer 8.2
indoor 8.2
restaurant 8.2
vacation 8.2
technology 8.2
team 8.1
smile 7.8
happiness 7.8
architecture 7.8
education 7.8
summer 7.7
two 7.6
executive 7.6
relax 7.6
fashion 7.5
holding 7.4
style 7.4
coffee 7.4
light 7.4
cheerful 7.3
worker 7.3
marimba 7.2
transportation 7.2
holiday 7.2
travel 7
together 7
creation 7

Google
created on 2022-01-23

Furniture 93.3
Chair 91.3
Black-and-white 87.7
Gesture 85.3
Style 84.1
Monochrome 79.6
Adaptation 79.3
People 77.8
Monochrome photography 76.8
Table 74.8
Event 72.7
Rectangle 71.8
Suit 70.7
Room 69
Sitting 68.5
Vintage clothing 65.9
Flooring 65.7
Art 64.8
Curtain 64.7
Happy 64

Microsoft
created on 2022-01-23

person 78.1
clothing 73.2
black and white 65.2
man 57.6

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 94.8%
Calm 98.7%
Happy 0.4%
Disgusted 0.3%
Surprised 0.3%
Sad 0.2%
Angry 0.1%
Fear 0%
Confused 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Chair 85.8%
Guitar 56.8%

Text analysis

Amazon

NAMTSA3
NAMTSA3 رواية
رواية

Google

MAMTCAS
MAMTCAS