Human Generated Data

Title

Untitled (man and four women sitting at tables at Christmas ball)

Date

1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9680

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Sitting 98.8
Person 98.5
Person 98
Person 97.8
Crowd 93.2
Room 91.4
Indoors 91.4
Audience 90.6
Furniture 75.8
Text 68.2
Speech 64.2
Portrait 62.7
Face 62.7
Photography 62.7
Photo 62.7
People 62
Table 57.8
Chair 57.3
Lecture 57
Press Conference 56.4
Clothing 56.2
Apparel 56.2

Clarifai
created on 2023-10-27

people 99.1
woman 98.2
adult 98
desk 98
furniture 97.6
man 97.6
group 97.4
group together 96.8
sit 96.7
monochrome 95.4
room 94.7
sitting 94
chair 93.5
indoors 93.1
three 86.8
table 86.7
league 85.9
administration 83.4
four 83
office 81.3

Imagga
created on 2022-01-23

classroom 56
room 52.8
office 44.9
business 38.3
man 37
table 36.8
businessman 36.2
laptop 34.7
meeting 32
male 31.9
people 31.8
computer 31.7
person 31
group 26.6
businesswoman 26.4
adult 25.8
desk 25.7
sitting 24.9
executive 24.9
work 24.3
professional 24.3
team 24.2
corporate 24.1
indoors 22
education 21.7
smiling 20.3
businesspeople 19.9
teamwork 19.5
happy 18.8
teacher 18.6
working 18.6
communication 18.5
women 18.2
manager 17.7
confident 17.3
talking 17.1
indoor 16.4
worker 16.1
looking 16
together 15.8
smile 15.7
director 15.1
colleagues 14.6
chair 14.4
home 14.4
suit 13.6
technology 13.4
modern 13.3
job 13.3
student 13.1
men 12.9
successful 12.8
conference 12.7
partners 12.6
workplace 12.4
couple 12.2
notebook 12.1
mature 12.1
discussion 11.7
interior 11.5
showing 11.3
horizontal 10.9
musical instrument 10.7
corporation 10.6
class 10.6
collar 10.6
coffee 10.2
lifestyle 10.1
handsome 9.8
businessperson 9.8
pen 9.4
happiness 9.4
learning 9.4
study 9.3
two 9.3
presentation 9.3
hand 9.1
board 9.1
blackboard 9
cheerful 8.9
collaboration 8.9
success 8.9
discussing 8.8
coworkers 8.8
wind instrument 8.8
monitor 8.7
expression 8.5
casual 8.5
friends 8.5
senior 8.4
company 8.4
house 8.4
document 8.4
color 8.3
holding 8.3
brass 8.2
seminar 7.9
paper 7.8
teaching 7.8
40s 7.8
portrait 7.8
college 7.6
adults 7.6
contemporary 7.5
clothing 7.4
training 7.4
phone 7.4
school 7.4
face 7.1
glass 7

Google
created on 2022-01-23

Furniture 93.6
Chair 89.8
Table 89.3
Style 83.8
Black-and-white 83
Monochrome 73.6
Monochrome photography 73.5
Room 71.7
Desk 71.6
Font 70.9
Art 70.4
Event 70.1
Sitting 68
Suit 66
Stock photography 62.9
Rectangle 60
Classic 58.8
Vintage clothing 57.1
Visual arts 56.6
History 55.4

Microsoft
created on 2022-01-23

wall 96.9
indoor 96.9
table 95
furniture 80.7
text 76.1
person 72.1
drawing 63
clothing 62.3
sketch 53.7
room 42.8

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 96.9%
Happy 69.9%
Sad 11.7%
Calm 9.3%
Surprised 3.6%
Confused 2.2%
Fear 1.3%
Disgusted 1.3%
Angry 0.7%

AWS Rekognition

Age 45-51
Gender Female, 86.3%
Happy 53.6%
Sad 26.2%
Calm 12.3%
Confused 2.6%
Surprised 1.9%
Fear 1.2%
Disgusted 1.2%
Angry 0.9%

AWS Rekognition

Age 20-28
Gender Female, 65.2%
Happy 90.3%
Calm 7%
Sad 0.9%
Surprised 0.6%
Confused 0.5%
Disgusted 0.2%
Fear 0.2%
Angry 0.2%

AWS Rekognition

Age 31-41
Gender Female, 73.9%
Calm 88.6%
Sad 6%
Happy 3.7%
Confused 0.8%
Surprised 0.3%
Angry 0.2%
Fear 0.2%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.4%
Person 98.5%
Person 98%
Person 97.8%

Text analysis

Amazon

24035
١ ع د

Google

LIO 3 J MJI7-- YT37A°2 - - XAGON
LIO
3
J
MJI7--
YT37A°2
-
XAGON