Human Generated Data

Title

Untitled (men in suits standing in police line-up, audience watching)

Date

1962

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15957

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.2
Human 99.2
Chair 99.2
Furniture 99.2
Chair 99.1
Audience 98.5
Crowd 98.5
Person 98.4
Person 98.3
Person 96.5
Chair 96.5
Person 96.1
Person 92.1
Speech 91.4
Person 90.5
Person 90.3
Lecture 89.2
Person 87.6
Indoors 86.1
Person 85.8
Chair 80.9
Room 80.2
Person 73.8
Interior Design 71.3
Screen 61.6
Electronics 61.6
Person 60.3
Monitor 58.6
Display 58.6
Press Conference 56.6
Person 49.6

Clarifai
created on 2023-10-29

people 98.6
meeting 98.5
chair 98
room 97.9
indoors 95.4
Congress 94.4
league 94.2
group 94
lecture 93.5
education 90.2
airport 90.1
man 89.8
seminar 89.8
leader 88.9
business 88.4
administration 87.5
group together 87.3
school 86.4
convention 86.2
many 84.5

Imagga
created on 2022-02-05

room 64.3
classroom 47.5
interior 44.2
restaurant 40.6
chair 37.6
hall 36
table 34.6
building 28.3
indoors 28.1
modern 28
furniture 27.7
office 23.5
cafeteria 23
business 21.3
design 20.8
center 20.8
people 19
house 18.4
indoor 18.3
floor 17.7
man 17.5
light 17.4
work 17.3
structure 16.9
chairs 16.6
window 16.6
empty 16.3
home 15.9
inside 15.6
person 15.3
decor 15
glass 14.8
seat 13.9
corporate 13.7
comfortable 13.4
wood 13.3
dining 13.3
architecture 13.3
wall 12.8
businessman 12.4
meeting 12.3
male 12.1
professional 11.9
tables 11.8
teacher 11.8
communication 11.8
conference 11.7
lifestyle 11.6
computer 11.4
desk 11.3
speaker 11.2
sitting 11.2
luxury 11.1
decoration 10.9
hotel 10.5
urban 10.5
lamp 10.5
group 10.5
contemporary 10.3
dinner 10.1
relaxation 10
adult 9.8
food 9.7
corporation 9.7
style 9.6
education 9.5
living 9.5
3d 9.3
elegance 9.2
stylish 9
team 9
working 8.8
women 8.7
sofa 8.6
men 8.6
space 8.5
relax 8.4
teamwork 8.3
board 8.1
suit 8.1
entrepreneur 8.1
kitchen 8
render 7.8
residential 7.7
life 7.6
articulator 7.6
horizontal 7.5
learning 7.5
shop 7.5
coffee 7.4
occupation 7.3
smiling 7.2
night 7.1
job 7.1

Google
created on 2022-02-05

Furniture 94.4
Chair 92.8
Event 73.6
Suit 71.5
Projection screen 69.4
Conference hall 65
Job 62.8
Public speaking 62.5
Room 62.5
Audience 58.6
Collaboration 57
Sitting 56.9
Presentation 56.8
Auditorium 56.8
Display device 55.4
Training 54.1
Seminar 54
Meeting 53.5
Class 52.9
Hall 52.9

Microsoft
created on 2022-02-05

person 96.7
furniture 95.6
clothing 95.6
chair 95.4
man 88.9
computer 88.4
ceiling 81.4
people 75.5
conference hall 69.7
laptop 68.9
group 60.9
table 52.8
convention 51.8
line 18.1

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 99.7%
Calm 62.2%
Disgusted 28.6%
Confused 2.2%
Happy 2.1%
Angry 1.8%
Sad 1.1%
Surprised 1%
Fear 0.9%

AWS Rekognition

Age 30-40
Gender Male, 64.8%
Happy 48.2%
Calm 40.6%
Surprised 2.7%
Sad 2.3%
Angry 2%
Disgusted 1.8%
Fear 1.5%
Confused 0.8%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 96.1%
Happy 3.5%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Angry 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Male, 99.9%
Calm 93.5%
Surprised 2.8%
Disgusted 1.2%
Confused 0.6%
Fear 0.5%
Sad 0.5%
Happy 0.5%
Angry 0.5%

AWS Rekognition

Age 20-28
Gender Female, 50.8%
Calm 90.7%
Happy 8.3%
Surprised 0.4%
Angry 0.1%
Disgusted 0.1%
Sad 0.1%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 19-27
Gender Female, 58.8%
Calm 94.1%
Sad 2.3%
Fear 2.3%
Happy 0.5%
Disgusted 0.3%
Surprised 0.2%
Angry 0.2%
Confused 0.2%

AWS Rekognition

Age 19-27
Gender Female, 56%
Calm 87.8%
Fear 4.8%
Sad 2.4%
Surprised 1.5%
Disgusted 1.2%
Happy 0.8%
Angry 0.8%
Confused 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Likely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Person 98.4%
Person 98.3%
Person 96.5%
Person 96.1%
Person 92.1%
Person 90.5%
Person 90.3%
Person 87.6%
Person 85.8%
Person 73.8%
Person 60.3%
Person 49.6%
Chair 99.2%
Chair 99.1%
Chair 96.5%
Chair 80.9%

Categories

Imagga

events parties 99.3%

Text analysis

Amazon

4
C
LIFW
Y 133492